
Showing papers by "DePaul University" published in 2004



Journal ArticleDOI
TL;DR: An updated theoretical model of applicant reactions to selection procedures is proposed and tested using meta-analysis; results indicated that applicants who hold positive perceptions about selection are more likely to view the organization favorably and report stronger intentions to accept job offers and recommend the employer to others.
Abstract: An updated theoretical model of applicant reactions to selection procedures is proposed and tested using meta-analysis. Results from 86 independent samples (N= 48,750) indicated that applicants who hold positive perceptions about selection are more likely to view the organization favorably and report stronger intentions to accept job offers and recommend the employer to others. Applicant perceptions were positively correlated with actual and perceived performance on selection tools and with self-perceptions. The average correlation between applicant perceptions and gender, age, and ethnic background was near zero. Face validity and perceived predictive validity were strong predictors of many applicant perceptions including procedural justice, distributive justice, attitudes towards tests, and attitudes towards selection. Interviews and work samples were perceived more favorably than cognitive ability tests, which were perceived more favorably than personality inventories, honesty tests, biodata, and graphology. The discussion identifies remaining theoretical and methodological issues as well as directions for future research.

672 citations
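
The core computation behind a meta-analytic summary like this is a sample-size-weighted mean correlation across independent samples. A minimal sketch, assuming made-up (r, n) pairs rather than the paper's 86 actual samples:

```python
# Sample-size-weighted mean correlation: the basic building block of a
# bare-bones meta-analysis (no artifact corrections). The (r, n) pairs
# below are hypothetical placeholders, not values from the paper.
studies = [(0.32, 150), (0.25, 480), (0.41, 90), (0.18, 1200)]

total_n = sum(n for _, n in studies)
r_bar = sum(r * n for r, n in studies) / total_n

# Variance expected from sampling error alone if a single true correlation
# underlies all studies; comparing with observed variance gauges heterogeneity.
var_err = sum(n * (1 - r_bar**2) ** 2 / (n - 1) for _, n in studies) / total_n
print(f"weighted mean r = {r_bar:.3f}, expected sampling variance = {var_err:.5f}")
```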


Journal ArticleDOI
TL;DR: In this article, the authors use data from a survey of small businesses to analyze the micro-level differences in the loan approval processes of large and small banks, and provide evidence that large banks employ standard criteria obtained from financial statements in the decision process, whereas small banks rely to a greater extent on information about the character of the borrower.
Abstract: The informational opacity of small businesses makes them an interesting area for the study of banks' lending practices and procedures. We use data from a survey of small businesses to analyze the micro-level differences in the loan approval processes of large and small banks. We provide evidence that large banks ($1 billion or more in assets) employ standard criteria obtained from financial statements in the loan decision process, whereas small banks rely to a greater extent on information about the character of the borrower. These cookie-cutter and character approaches are compatible with the incentives and environments facing large and small banks.

537 citations


Journal ArticleDOI
TL;DR: Research on the association between stressors and symptoms of psychopathology in children and adolescents is reviewed, with a focus on measurement issues and prospective effects; results suggest that stressors predict changes in rates of symptoms of psychopathology over time.
Abstract: This article reviews existing research on the association between stressors and symptoms of psychopathology in children and adolescents with a focus on measurement issues and prospective effects. The first half of the article focuses on the measurement of stressors, emphasizing checklists and interviews. Available measures of stressful experiences are reviewed and critiqued. Results of this review reveal both substantial progress (i.e., development of valid stressor assessment tools) and remaining problems (i.e., inconsistent measurement across studies). The second half of this article reviews studies that have tested for prospective associations between stressors and symptoms of psychopathology in children and adolescents. Studies that have examined the prospective effects of recent or prior stressors on current psychological symptoms, while controlling for prior psychological symptoms, are reviewed. Results overall suggest that stressors predict changes in rates of symptoms of psychopathology in children and adolescents over time. Results also suggest that symptoms of psychopathology predict changes in rates of stressors over time. Implications of these findings are that conclusive evidence now exists for the importance of stressors in the development of child and adolescent psychopathology.

531 citations
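
The prospective tests reviewed above regress current symptoms on earlier stressors while controlling for earlier symptoms. A minimal sketch with simulated data; all variable names, coefficients, and sample sizes are illustrative, not drawn from any study in the review:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
symptoms_t1 = rng.normal(size=n)                      # prior symptoms
stressors   = 0.3 * symptoms_t1 + rng.normal(size=n)  # stressors, partly driven by prior symptoms
symptoms_t2 = 0.5 * symptoms_t1 + 0.25 * stressors + rng.normal(size=n)

# OLS of time-2 symptoms on stressors, controlling for time-1 symptoms.
X = np.column_stack([np.ones(n), symptoms_t1, stressors])
beta, *_ = np.linalg.lstsq(X, symptoms_t2, rcond=None)
print(f"intercept={beta[0]:.2f}, prior symptoms={beta[1]:.2f}, stressors={beta[2]:.2f}")
# A nonzero stressor coefficient after controlling for prior symptoms is the
# prospective effect the review describes.
```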


Journal ArticleDOI
TL;DR: The diffusion model for 2-choice decisions (R. Ratcliff, 1978) was applied to data from lexical decision experiments in which word frequency, proportion of high- versus low-frequency words, and type of nonword were manipulated.
Abstract: The diffusion model for 2-choice decisions (R. Ratcliff, 1978) was applied to data from lexical decision experiments in which word frequency, proportion of high- versus low-frequency words, and type of nonword were manipulated. The model gave a good account of all of the dependent variables--accuracy, correct and error response times, and their distributions--and provided a description of how the component processes involved in the lexical decision task were affected by experimental variables. All of the variables investigated affected the rate at which information was accumulated from the stimuli--called drift rate in the model. The different drift rates observed for the various classes of stimuli can all be explained by a 2-dimensional signal-detection representation of stimulus information. The authors discuss how this representation and the diffusion model's decision process might be integrated with current models of lexical access.

493 citations
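
The diffusion model itself is easy to simulate: evidence accumulates at a drift rate until it crosses an upper (word) or lower (nonword) boundary. A toy random-walk sketch; the parameter values are illustrative, not Ratcliff's fitted estimates:

```python
import numpy as np

def diffusion_trial(drift, a=0.1, z=0.05, s=0.1, dt=0.001, t0=0.3, rng=None):
    """One two-choice diffusion trial. a = boundary separation, z = starting
    point, s = within-trial noise, t0 = nondecision time (seconds)."""
    if rng is None:
        rng = np.random.default_rng()
    x, t = z, 0.0
    while 0.0 < x < a:
        x += drift * dt + s * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return ("word" if x >= a else "nonword"), t0 + t

rng = np.random.default_rng(1)
trials = [diffusion_trial(drift=0.3, rng=rng) for _ in range(2000)]  # e.g. high-frequency words
acc = np.mean([resp == "word" for resp, _ in trials])
mean_rt = np.mean([rt for resp, rt in trials if resp == "word"])
print(f"P(word) = {acc:.3f}, mean correct RT = {mean_rt:.3f} s")
# Higher drift rates yield faster, more accurate responses; the model is fit
# to accuracy and the full RT distributions jointly.
```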


Proceedings ArticleDOI
07 Mar 2004
TL;DR: A set of techniques and algorithms to automatically discover policy anomalies in centralized and distributed legacy firewalls is presented, implemented in a software tool called the "Firewall Policy Advisor" that simplifies the management of filtering rules and maintains the security of next-generation firewalls.
Abstract: Firewalls are core elements in network security. However, managing firewall rules, particularly in multi-firewall enterprise networks, has become a complex and error-prone task. Firewall filtering rules have to be written, ordered and distributed carefully in order to avoid firewall policy anomalies that might cause network vulnerability. Therefore, inserting or modifying filtering rules in any firewall requires thorough intra- and inter-firewall analysis to determine the proper rule placement and ordering in the firewalls. We identify all anomalies that could exist in a single- or multi-firewall environment. We also present a set of techniques and algorithms to automatically discover policy anomalies in centralized and distributed legacy firewalls. These techniques are implemented in a software tool called the "Firewall Policy Advisor" that simplifies the management of filtering rules and maintains the security of next-generation firewalls.

421 citations
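
One anomaly class the paper identifies is shadowing: an earlier rule matches a superset of a later rule's traffic but takes a different action, so the later rule can never fire. A simplified detector over source/destination prefixes; the rule set and fields are invented for illustration, and the actual Firewall Policy Advisor algorithms are more general:

```python
from ipaddress import ip_network

# (action, src, dst) filtering rules, checked in order. Hypothetical rule set.
rules = [
    ("deny",   "140.192.37.0/24", "0.0.0.0/0"),
    ("accept", "140.192.37.0/24", "161.120.33.40/32"),  # shadowed by rule 0
    ("accept", "0.0.0.0/0",       "161.120.33.40/32"),
]

def shadows(earlier, later):
    """True if `earlier` matches a superset of `later`'s traffic with a different action."""
    (a1, s1, d1), (a2, s2, d2) = earlier, later
    return (a1 != a2
            and ip_network(s2).subnet_of(ip_network(s1))
            and ip_network(d2).subnet_of(ip_network(d1)))

for j, later in enumerate(rules):
    for i in range(j):
        if shadows(rules[i], later):
            print(f"rule {j} is shadowed by rule {i}")
```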


Journal ArticleDOI
08 Jun 2004
TL;DR: A metric between labelled Markov processes is studied that has the property that processes are at zero distance if and only if they are bisimilar; the metric is related, in spirit, to the Hutchinson metric.
Abstract: The notion of process equivalence of probabilistic processes is sensitive to the exact probabilities of transitions. Thus, a slight change in the transition probabilities will result in two equivalent processes being deemed no longer equivalent. This instability is due to the quantitative nature of probabilistic processes. In a situation where the process behavior has a quantitative aspect there should be a more robust approach to process equivalence. This paper studies a metric between labelled Markov processes. This metric has the property that processes are at zero distance if and only if they are bisimilar. The metric is inspired by earlier work on logics for characterizing bisimulation and is related, in spirit, to the Hutchinson metric.

364 citations
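
For finite processes, metrics of this family can be computed by iterating a fixed point: the distance between two states is the discounted Kantorovich (transportation) distance between their transition distributions, with ground costs given by the current distance estimate. A sketch of a common simplified variant in which states with different labels sit at distance 1; the process, labels, and discount factor are invented:

```python
import numpy as np
from scipy.optimize import linprog

def kantorovich(p, q, cost):
    """Minimum-cost transport between distributions p and q under `cost`."""
    n = len(p)
    A_eq, b_eq = [], []
    for i in range(n):                       # row marginals: sum_j x[i, j] = p[i]
        row = np.zeros((n, n)); row[i, :] = 1
        A_eq.append(row.ravel()); b_eq.append(p[i])
    for j in range(n):                       # column marginals: sum_i x[i, j] = q[j]
        col = np.zeros((n, n)); col[:, j] = 1
        A_eq.append(col.ravel()); b_eq.append(q[j])
    res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=(0, None), method="highs")
    return res.fun

# Hypothetical 3-state process; states 0 and 1 are near-copies, state 2 a "goal".
P = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.4, 0.1],
              [0.0, 0.0, 1.0]])
labels = ["idle", "idle", "goal"]
gamma = 0.9                                  # discount on future differences

d = np.zeros((3, 3))
for _ in range(30):                          # iterate toward the fixed point
    d = np.array([[1.0 if labels[s] != labels[t]
                   else gamma * kantorovich(P[s], P[t], d)
                   for t in range(3)] for s in range(3)])
print(np.round(d, 3))
# States 0 and 1 end up at a small but nonzero distance (~0.09), reflecting the
# 0.1 perturbation; bisimilar states would sit at distance exactly 0.
```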


Journal ArticleDOI
TL;DR: To test the Symanzik-improved staggered-quark discretization formalism, experiment is compared with a variety of nonperturbative calculations in QCD drawn from a restricted set of "gold-plated" quantities, finding agreement to within statistical and systematic errors of 3% or less.
Abstract: The recently developed Symanzik-improved staggered-quark discretization allows unquenched lattice-QCD simulations with much smaller (and more realistic) quark masses than previously possible. To test this formalism, we compare experiment with a variety of nonperturbative calculations in QCD drawn from a restricted set of "gold-plated" quantities. We find agreement to within statistical and systematic errors of 3% or less. We discuss the implications for phenomenology and, in particular, for heavy-quark physics.

275 citations


Journal ArticleDOI
TL;DR: This article examined associations between different child characteristics and conflict, closeness, and dependency within teacher-student relationships and found that externalizing and internalizing symptomology demonstrated the strongest associations with the conflict and dependency relationship constructs.
Abstract: The purpose of this investigation was to examine associations between different child characteristics and conflict, closeness, and dependency within teacher–student relationships. The participants were primarily students of color from lower socioeconomic status backgrounds in a large urban school district. The strength of associations between student demographic variables, academic orientations, behavioral orientations, and aspects of teacher–student relationships was examined. Findings indicated that these variables accounted for a significant amount of variance in teacher ratings of conflict and dependency in teacher–student relationships. Externalizing and internalizing symptomology demonstrated the strongest associations with the conflict and dependency relationship constructs. Preliminary implications of these findings for teachers and school psychologists are explored. © 2004 Wiley Periodicals, Inc. Psychol Schs 41: 751–762, 2004.

268 citations


Journal ArticleDOI
TL;DR: The effects of aging on response time (RT) are examined in 2 lexical-decision experiments with young and older subjects (age 60-75); the results show that the older subjects were slower than the young subjects, but more accurate.
Abstract: The effects of aging on response time (RT) are examined in 2 lexical-decision experiments with young and older subjects (age 60-75). The results show that the older subjects were slower than the young subjects, but more accurate. R. Ratcliff's (1978) diffusion model provided a good account of RTs, their distributions, and response accuracy. The fits show an 80-100-ms slowing of the nondecision components of RT for older subjects relative to young subjects and more conservative decision criterion settings for older subjects than for young subjects. The rates of accumulation of evidence were not significantly different for older compared with young subjects (less than 2% and 5% higher for older subjects relative to young subjects in the 2 experiments).

253 citations


Journal ArticleDOI
TL;DR: A set of techniques and algorithms are presented that provide automatic discovery of firewall policy anomalies to reveal rule conflicts and potential problems in legacy firewalls, and anomaly-free policy editing for rule insertion, removal, and modification.
Abstract: Firewalls are core elements in network security. However, managing firewall rules, especially for enterprise networks, has become complex and error-prone. Firewall filtering rules have to be carefully written and organized in order to correctly implement the security policy. In addition, inserting or modifying a filtering rule requires thorough analysis of the relationship between this rule and other rules in order to determine the proper order of this rule and commit the updates. In this paper we present a set of techniques and algorithms that provide automatic discovery of firewall policy anomalies to reveal rule conflicts and potential problems in legacy firewalls, and anomaly-free policy editing for rule insertion, removal, and modification. This is implemented in a user-friendly tool called "Firewall Policy Advisor." The Firewall Policy Advisor significantly simplifies the management of any generic firewall policy written as filtering rules, while minimizing network vulnerability due to firewall rule misconfiguration.

Journal ArticleDOI
TL;DR: In this paper, the authors investigated two potentially relevant antecedents to performing transformational leadership behavior: cynicism about organizational change (CAOC) and the leader's social context, specifically peer leadership behavior.
Abstract: Although transformational leadership behavior (TLB) has been linked to a number of positive organizational outcomes, research regarding the antecedents of such behavior is limited. Guided by Ajzen and Fishbein's theory of reasoned action [Psychological Bulletin 84 (1977) 888], we investigated two potentially relevant antecedents to performing TLB: cynicism about organizational change (CAOC) and the leader's social context—specifically peer leadership behavior. We hypothesized that CAOC would negatively predict TLB, while peer leadership behavior would positively predict TLB. Further, we expected that peer leadership behavior would have a positive moderating effect on leader CAOC. Data were gathered from 227 managers from multiple organizations and their 2247 subordinates. Findings supported the proposed hypotheses. Cynicism and peer leadership behavior explained nearly one quarter (24%) of the variance in TLB. Further, it appears that both CAOC and TLB may be malleable in organizational contexts. Implications for leadership research and practice are discussed.
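
The moderation hypothesis described here is the standard product-term regression: regress the outcome on both predictors and their interaction. A sketch with simulated data; the coefficients, sample, and names are illustrative, not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 227
caoc = rng.normal(size=n)                     # cynicism about organizational change
peer = rng.normal(size=n)                     # peer transformational leadership
# Simulated outcome: negative main effect of cynicism, positive main effect of
# peer behavior, and a positive interaction (peers offsetting cynicism).
tlb = -0.4 * caoc + 0.5 * peer + 0.2 * caoc * peer + rng.normal(size=n)

X = np.column_stack([np.ones(n), caoc, peer, caoc * peer])
beta, res, *_ = np.linalg.lstsq(X, tlb, rcond=None)
r2 = 1 - res[0] / np.sum((tlb - tlb.mean()) ** 2)
print(f"b_caoc={beta[1]:.2f}, b_peer={beta[2]:.2f}, "
      f"b_interaction={beta[3]:.2f}, R^2={r2:.2f}")
# A positive interaction coefficient is the hypothesized moderating effect;
# centering predictors before forming the product term is common practice.
```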

Journal ArticleDOI
Ray W. Coye
TL;DR: In this article, the authors propose a model that outlines the process through which expectations operate at the point of delivery and provide a framework for future investigations, focusing on service provider behavior and control of cues that may influence consumer expectations once they have entered the delivery system.
Abstract: Consumers of services have expectations about what they will receive from the delivery system. These expectations are beliefs about future events which, when compared with the perceived actual service delivered, are presumed to influence satisfaction and assessments of overall service quality. The purpose of this paper is to propose a model that outlines the process through which expectations operate at the point of delivery and provide a framework for future investigations. Implications for management practice focus on service provider behavior at the point of delivery and on control of cues that may influence consumer expectations once they have entered the delivery system. Directions for research include verification of model relationships and identification of specific types of cues that relate to attributes commonly considered in consumers’ judgements of service quality.

Journal ArticleDOI
TL;DR: In this article, the authors examined the association of narcissistic features with aggression and internalizing symptoms in 233 students of 5th-8th grade at three inner-city schools and found that narcissistic exploitativeness positively predicted self-reported proactive aggression, and narcissistic exhibitionism positively predicted internalising symptoms.
Abstract: Recent research and theory suggest narcissistic features contribute to aggression in adults. The present study examined the association of narcissistic features with aggression and internalizing symptoms in 233 students of 5th‐8th grade at three inner-city schools. A factor analysis of the Narcissistic Personality Inventory in this sample revealed three factors: Adaptive Narcissism, Exploitativeness, and Exhibitionism. Regression analyses were used to predict the association of these three narcissistic features with self-, teacher-, and peer-reported aggression and self-reported internalizing symptoms. Results indicate narcissistic exploitativeness positively predicted self-reported proactive aggression, and narcissistic exhibitionism positively predicted internalizing symptoms. Narcissism and self-esteem interacted to predict teacher-reported aggression and self-reported internalizing symptoms. Results are discussed in the context of existing theories of narcissism, threatened egotism, and self-perception bias.

Proceedings ArticleDOI
22 Aug 2004
TL;DR: A unified framework for the discovery and analysis of Web navigational patterns based on Probabilistic Latent Semantic Analysis is developed and the flexibility of this framework is shown in characterizing various relationships among users and Web objects.
Abstract: The primary goal of Web usage mining is the discovery of patterns in the navigational behavior of Web users. Standard approaches, such as clustering of user sessions and discovering association rules or frequent navigational paths, do not generally provide the ability to automatically characterize or quantify the unobservable factors that lead to common navigational patterns. It is, therefore, necessary to develop techniques that can automatically discover hidden semantic relationships among users as well as between users and Web objects. Probabilistic Latent Semantic Analysis (PLSA) is particularly useful in this context, since it can uncover latent semantic associations among users and pages based on the co-occurrence patterns of these pages in user sessions. In this paper, we develop a unified framework for the discovery and analysis of Web navigational patterns based on PLSA. We show the flexibility of this framework in characterizing various relationships among users and Web objects. Since these relationships are measured in terms of probabilities, we are able to use probabilistic inference to perform a variety of analysis tasks such as user segmentation, page classification, as well as predictive tasks such as collaborative recommendations. We demonstrate the effectiveness of our approach through experiments performed on real-world data sets.
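
PLSA models the session-page co-occurrence matrix with hidden factors z via P(u, p) = Σ_z P(z) P(u|z) P(p|z), fit by EM. A compact numpy sketch on a toy count matrix; the counts and the number of factors are invented, whereas the paper fits real session data:

```python
import numpy as np

rng = np.random.default_rng(3)
N = rng.integers(0, 5, size=(8, 6)).astype(float)  # toy sessions-by-pages counts
U, P_, K = N.shape[0], N.shape[1], 2               # K latent factors

pz  = np.full(K, 1.0 / K)                          # P(z)
puz = rng.dirichlet(np.ones(U), size=K)            # P(u|z), shape (K, U)
ppz = rng.dirichlet(np.ones(P_), size=K)           # P(p|z), shape (K, P)

for _ in range(100):
    # E-step: P(z|u,p) proportional to P(z) P(u|z) P(p|z)
    joint = pz[:, None, None] * puz[:, :, None] * ppz[:, None, :]  # (K, U, P)
    post = joint / np.maximum(joint.sum(axis=0, keepdims=True), 1e-12)
    # M-step: re-estimate parameters from expected counts
    w = N[None, :, :] * post
    pz  = w.sum(axis=(1, 2)) / N.sum()
    puz = w.sum(axis=2) / np.maximum(w.sum(axis=(1, 2))[:, None], 1e-12)
    ppz = w.sum(axis=1) / np.maximum(w.sum(axis=(1, 2))[:, None], 1e-12)

# P(z|u) gives a soft user segmentation; P(p|z) characterizes each factor by pages.
print(np.round(ppz, 3))
```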

Journal ArticleDOI
TL;DR: In this article, the authors show that implausible external explanations for poor performance lead analysts to provide lower earnings forecasts and assess a higher cost of capital than if the explanation had not been provided.
Abstract: Managers often provide self-serving disclosures that blame poor financial performance on temporary, external factors. Results of an experiment conducted with 124 financial analysts suggest that when analysts perceive such disclosures as plausible, they provide higher earnings forecasts and stock valuations than if the explanation had not been provided. However, we also show that these disclosures can backfire if analysts find them implausible. Specifically, implausible external explanations for poor performance lead analysts to provide lower earnings forecasts and assess a higher cost of capital than if the explanation had not been provided.

Journal ArticleDOI
TL;DR: General background regarding the empirical research needs and concerns of LGB people of color is provided, and the articles included in this special issue are introduced.
Abstract: Lesbian, gay, and bisexual (LGB) people of color may experience multiple layers of oppression, as they often not only contend with the negative societal reactions to their sexual orientation but also may experience racial prejudice, limited economic resources, and limited acceptance within their own cultural community. Despite the range of psychosocial issues that may be encountered by this population, and the need to understand factors that promote resiliency and well-being, the empirical psychological literature has virtually ignored LGB people of color. This article provides general background regarding the empirical research needs and concerns regarding LGB people of color and introduces the articles included in this special issue. Recommendations for increasing research with LGB people of color are offered.

Journal ArticleDOI
TL;DR: This paper looks into the key infrastructure factors affecting the success of small companies in developing economies that are establishing B2B e-commerce ventures, and reveals that workers' skills, client interface, and technical infrastructure are the most important factors for success.
Abstract: This paper looks into the key infrastructure factors affecting the success of small companies in developing economies that are establishing B2B e-commerce ventures. The factors were identified through a literature review and a pilot study carried out in two organizations. The results of the pilot study and literature review reveal five factors that contribute to the success of B2B e-commerce. These factors were later assessed for importance using a survey. The outcome of our analysis reveals that workers' skills, client interface, and technical infrastructure are the most important factors to the success of a B2B e-commerce relationship.

Journal ArticleDOI
TL;DR: A general sampling procedure to quantify model mimicry, defined as the ability of a model to account for data generated by a competing model, is presented, and application of both the data-informed and the data-uninformed parametric bootstrap cross-fitting method (PBCM) is illustrated.
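
The PBCM idea: generate synthetic data from each candidate model, fit both models to every synthetic data set, and use the resulting distributions of goodness-of-fit differences to calibrate model comparison on the real data. A toy data-uninformed version with a linear vs. quadratic model; the models, noise level, and sample size are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0, 1, 50)

def fit_sse(y, degree):
    """Sum of squared errors of a polynomial fit of the given degree."""
    coef = np.polyfit(x, y, degree)
    return float(np.sum((y - np.polyval(coef, x)) ** 2))

def gof_difference(gen_degree):
    """Generate data from one model (random parameters, hence 'data-uninformed'),
    fit both models, and return SSE(linear) - SSE(quadratic)."""
    true = np.polyval(rng.normal(size=gen_degree + 1), x)
    y = true + rng.normal(scale=0.3, size=x.size)
    return fit_sse(y, 1) - fit_sse(y, 2)

diffs_linear    = [gof_difference(1) for _ in range(500)]  # linear model true
diffs_quadratic = [gof_difference(2) for _ in range(500)]  # quadratic model true
print(f"median diff | linear true:    {np.median(diffs_linear):.3f}")
print(f"median diff | quadratic true: {np.median(diffs_quadratic):.3f}")
# Heavy overlap between the two distributions means the flexible model "mimics"
# the simple one; an observed difference is judged against these distributions.
```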

Journal ArticleDOI
TL;DR: This article finds that three company characteristics (market value of equity, book-to-market ratio, and dividend yield) capture style-related trends in equity returns and correlate well with stock market performance.

Journal ArticleDOI
TL;DR: The cost associated with entering online bids and the uncertainty about future entry--both of which distinguish Internet from live auctions--can explain this behavior, and a simple theoretical model is presented that derives the conditions under which jump bidding arises in a format commonly used for online trading, the ascending-price auction.
Abstract: A bidding strategy commonly observed in Internet auctions is that of "jump bidding," or entering a bid larger than what is necessary to be a currently winning bidder. In this paper, we argue that the cost associated with entering online bids and the uncertainty about future entry--both of which distinguish Internet from live auctions--can explain this behavior. We present a simple theoretical model that includes the preceding characteristics, and derive the conditions under which jump bidding arises in a format commonly used for online trading, the ascending-price auction. We also present evidence, recorded from hundreds of Internet auctions, that is consistent with some of the basic predictions from our model. We find that jump bidding is more likely earlier in an auction, when jumping has a larger strategic value, and that the incentives to jump bid increase as competition increases. Our results also indicate that jump bidding is effective: Jump bidders place fewer bids overall, and increased early jump bidding deters entry later in the auction. We also discuss possible means of reducing bidding costs and evidence that Internet auctioneers are pursuing this goal.

Proceedings ArticleDOI
13 Jun 2004
TL;DR: It is proved that the NP-hard distinguishing substring selection problem has no polynomial time approximation schemes of running time f(1/ε)·n^o(1/ε) for any function f unless an unlikely collapse occurs in parameterized complexity theory.
Abstract: We develop new techniques for deriving very strong computational lower bounds for a class of well-known NP-hard problems, including weighted satisfiability, dominating set, hitting set, set cover, clique, and independent set. For example, although a trivial enumeration can easily test in time O(n^k) if a given graph of n vertices has a clique of size k, we prove that unless an unlikely collapse occurs in parameterized complexity theory, the problem is not solvable in time f(k)·n^o(k) for any function f, even if we restrict the parameter value k to be bounded by an arbitrarily small function of n. Under the same assumption, we prove that even if we restrict the parameter values k to be Θ(μ(n)) for any reasonable function μ, no algorithm of running time n^o(k) can test if a graph of n vertices has a clique of size k. Similar strong lower bounds are also derived for other problems in the above class. Our techniques can be extended to derive computational lower bounds on approximation algorithms for NP-hard optimization problems. For example, we prove that the NP-hard distinguishing substring selection problem, for which a polynomial time approximation scheme has been recently developed, has no polynomial time approximation schemes of running time f(1/ε)·n^o(1/ε) for any function f unless an unlikely collapse occurs in parameterized complexity theory.
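
The trivial O(n^k) algorithm referenced above simply enumerates all k-subsets; the paper's lower bound says no algorithm can do asymptotically better than this exponent in k. A direct sketch of the enumeration on a toy graph:

```python
from itertools import combinations

def has_clique(adj, k):
    """Check all k-subsets of vertices: O(n^k) subsets, O(k^2) work each."""
    n = len(adj)
    return any(all(adj[u][v] for u, v in combinations(subset, 2))
               for subset in combinations(range(n), k))

# Toy graph: a triangle plus an isolated vertex (hypothetical adjacency matrix).
adj = [[0, 1, 1, 0],
       [1, 0, 1, 0],
       [1, 1, 0, 0],
       [0, 0, 0, 0]]
print(has_clique(adj, 3))  # True: vertices {0, 1, 2}
print(has_clique(adj, 4))  # False
```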

BookDOI
31 Jul 2004
TL;DR: Brysk and Shafir bring together essays examining the citizenship gap created by globalization, tracing how citizenship and rights are produced, constructed, and reconstructed in an era of globalization.
Abstract:
Part 1: Framework
1. Globalization and the Citizenship Gap (Alison Brysk and Gershon Shafir)
2. Citizenship and Human Rights in an Era of Globalization (Gershon Shafir)
Part 2: Producing Citizenship
3. Constituting Political Community (Ronnie Lipschutz)
4. Latitudes of Citizenship (Aihwa Ong)
Part 3: Constructing Rights
5. Agency on a Global Scale: Rules, Rights and the European Union (David Jacobson and Galya Benarieh Ruffer)
6. International Law and Citizenship: Mandated Membership, Diluted Identity (Peter Spiro)
Part 4: Globalizing the Citizenship Gap
7. Deflated Citizenship: Labor Rights in a Global Era (Gay W. Seidman)
8. The Globalization of Social Reproduction: Women Migrants (Kristen Hill Maher)
9. Children Across Borders: Patrimony, Property or Persons? (Alison Brysk)
Part 5: Reconstructing Citizenship
10. Citizenship and Globalism: Markets, Empire and Terrorism (Richard Falk)
11. The Repositioning of Citizenship (Saskia Sassen)
12. Globalizing Citizenship? (Alison Brysk and Gershon Shafir)

Journal ArticleDOI
TL;DR: In this paper, the behavior of fractional integral operators associated to a measure on a metric space satisfying a mild growth condition is investigated, namely that the measure of each ball is controlled by a fixed power of its radius.
Abstract: The main purpose of this paper is to investigate the behaviour of fractional integral operators associated to a measure on a metric space satisfying just a mild growth condition, namely that the measure of each ball is controlled by a fixed power of its radius. This allows, in particular, non-doubling measures. It turns out that this condition is enough to build up a theory that contains the classical results based upon the Lebesgue measure on euclidean space and their known extensions for doubling measures. We start by analyzing the images of the Lebesgue spaces associated to the measure. The Lipschitz spaces, defined in terms of the metric, play a basic role too. For a euclidean space equipped with one of these measures, we also consider the so-called "regular" BMO space introduced by X. Tolsa. We show that it contains the image of a Lebesgue space in the appropriate limit case and also that the image of the space "regular" BMO is contained in the adequate Lipschitz space.
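
For concreteness, the growth condition and the fractional integral in question typically take the following standard form in the non-doubling literature (this is the usual formulation, not a quotation from the paper):

```latex
% Growth condition (allows non-doubling measures):
\mu(B(x,r)) \le C\,r^{n} \qquad \text{for every ball } B(x,r),
% Fractional integral of order \alpha \in (0, n):
I_{\alpha}f(x) \;=\; \int \frac{f(y)}{d(x,y)^{\,n-\alpha}}\,d\mu(y).
```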

Journal ArticleDOI
TL;DR: A computational model of information navigation that simulates users navigating through a Web site finds that the optimal structure depends on the quality of the link labels and is able to account for the results in the previous studies.
Abstract: Previous studies for menu and Web search tasks have suggested differing advice on the optimal number of selections per page. In this article, we examine this discrepancy through the use of a computational model of information navigation that simulates users navigating through a Web site. By varying the quality of the link labels in our simulations, we find that the optimal structure depends on the quality of the labels, and we are thus able to account for the results in the previous studies. We present additional empirical results to further validate the model and corroborate our findings. Finally we discuss our findings' implications for the information architecture of Web sites.
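
A toy version of such a simulation: a user descends a link hierarchy, at each page choosing the link whose noisy label score looks best, where the noise level stands in for label quality. All structures and parameters below are illustrative, and this sketch is far cruder than the paper's model:

```python
import numpy as np

def nav_cost(branching, depth, label_noise, rng):
    """Crude label-following user: at each level, pick the link whose noisy
    label score is highest; a wrong pick costs one wasted page visit."""
    cost, level = 0.0, 0
    while level < depth:
        cost += branching                     # scan all labels on this page
        scores = rng.normal(0.0, label_noise, size=branching)
        scores[0] += 1.0                      # link 0 is on the path to the target
        if int(np.argmax(scores)) == 0:
            level += 1                        # correct choice: descend
        else:
            cost += branching                 # wasted visit to a wrong branch
    return cost

rng = np.random.default_rng(5)
for noise in (0.2, 1.0):                      # good vs. poor link labels
    for branching, depth in [(2, 6), (4, 3), (8, 2)]:  # 64 leaves each
        avg = np.mean([nav_cost(branching, depth, noise, rng) for _ in range(3000)])
        print(f"noise={noise}: {branching}x{depth} -> average cost {avg:.1f}")
```

This toy only illustrates the simulation method, i.e., how label noise inflates navigation cost differently for different site shapes; the paper's richer model of scanning and label assessment is what supports its conclusions about optimal structure.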

Journal ArticleDOI
TL;DR: In this paper, multi-element studies of sediments are shown to have successfully identified specific activity areas by analysing other elements in addition to phosphorus; to reach its full potential, however, sediment chemistry must be undertaken with an understanding of how these residues are formed.
Abstract: Archaeologists have employed sediment chemistry in site prospection for nearly a century. For example, phosphorus is a good indicator of human occupation, because it is a generic indicator of human activity. Recently, multi-element studies of sediments have successfully identified specific activity areas by analysing other elements in addition to phosphorus. To reach its full potential, however, sediment chemistry must be undertaken with an understanding of how these residues are formed and of the chemical indicators that can be used to identify specific activities. Methodologies that optimize the extraction of specific residues must be employed. Not to do so is a naive application of the technique.


Journal ArticleDOI
TL;DR: The incremental funding method is a financially informed approach to software development that maximizes returns by delivering functionality in "chunks" of customer-valued features, carefully sequenced to optimize the project's net present value (NPV).
Abstract: Software development projects don't get funded unless they return clearly defined value to the business. Demands for shorter investment periods, faster time-to-market, and increased agility require new, radical software development approaches. These approaches must draw on the expertise of both software architects and financial stakeholders and open the traditional black box of software development to rigorous financial analysis. We can accomplish this only by positioning software development as a value-creation activity in which business analysis is integral. The incremental funding method is a financially informed approach to software development. IFM maximizes returns by delivering functionality in "chunks" of customer-valued features, carefully sequenced to optimize the project's net present value (NPV). We derived the IFM concepts from several years' experience in winning competitive contracts for large-systems integration and application development projects.
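
IFM's core computation: each feature "chunk" has a cash-flow stream (cost while being built, revenue once delivered), and the delivery sequence changes the project NPV. A small brute-force sketch; the cash flows, development times, and discount rate are invented:

```python
from itertools import permutations

RATE = 0.02      # discount rate per period (hypothetical)
HORIZON = 12     # periods over which cash flows count

# Per-feature cash flows from the period development starts: negative while
# being built, positive once delivered. Purely illustrative numbers.
features = {
    "A": [-50, -50, 20, 20, 20, 20, 20, 20],
    "B": [-20, 10, 10, 10, 10, 10, 10, 10],
    "C": [-80, -80, -80, 60, 60, 60, 60, 60],
}
DEV_PERIODS = {"A": 2, "B": 1, "C": 3}  # development time per feature

def npv(sequence):
    """Develop features one at a time in `sequence`; discount all cash flows."""
    total, start = 0.0, 0
    for name in sequence:
        for i, cf in enumerate(features[name]):
            t = start + i
            if t < HORIZON:
                total += cf / (1 + RATE) ** (t + 1)
        start += DEV_PERIODS[name]
    return total

for seq in permutations(features):
    print("->".join(seq), f"NPV = {npv(seq):.1f}")
best = max(permutations(features), key=npv)
print("best sequence:", "->".join(best))
# Sequencing quick-payback features early funds later development, which is
# exactly the effect IFM optimizes.
```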

Journal ArticleDOI
TL;DR: It is shown that the conclusions from the normal approximation are flawed, especially in the range of service levels where most companies operate; for firms operating at these service levels, decreasing lead time, not reducing lead time variability, is the right lever for cutting inventories.
Abstract: The pressure to reduce inventory investments in supply chains has increased as competition expands and product variety grows. Managers are looking for areas they can improve to reduce inventories without hurting the level of service provided. Two areas that managers focus on are the reduction of the replenishment lead time from suppliers and the variability of this lead time. The normal approximation of lead time demand distribution indicates that both actions reduce inventories for cycle service levels above 50%. The normal approximation also indicates that reducing lead time variability tends to have a greater impact than reducing lead times, especially when lead time variability is large. We build on the work of Eppen and Martin (1988) to show that the conclusions from the normal approximation are flawed, especially in the range of service levels where most companies operate. We show the existence of a service-level threshold greater than 50% below which reorder points increase with a decrease in lead time variability. Thus, for a firm operating just below this threshold, reducing lead times decreases reorder points, whereas reducing lead time variability increases reorder points. For firms operating at these service levels, decreasing lead time, not reducing lead time variability, is the right lever for cutting inventories.
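
Under the normal approximation the paper critiques, the reorder point is R = mu_D * mu_L + z * sqrt(mu_L * sigma_D^2 + mu_D^2 * sigma_L^2) with z = the inverse normal CDF at the cycle service level. A sketch of the sensitivity being revisited; the demand and lead-time parameters are illustrative:

```python
from math import sqrt
from scipy.stats import norm

def reorder_point(csl, mu_d, sigma_d, mu_l, sigma_l):
    """Normal approximation: R = mean lead-time demand + z * its std deviation."""
    z = norm.ppf(csl)
    return mu_d * mu_l + z * sqrt(mu_l * sigma_d**2 + mu_d**2 * sigma_l**2)

base = dict(mu_d=100, sigma_d=20, mu_l=10, sigma_l=3)  # hypothetical parameters
for csl in (0.50, 0.60, 0.95):
    r0 = reorder_point(csl, **base)
    r_short = reorder_point(csl, **{**base, "mu_l": 8})      # shorter lead time
    r_stable = reorder_point(csl, **{**base, "sigma_l": 1})  # steadier lead time
    print(f"CSL={csl:.2f}: R={r0:.0f}, shorter L -> {r_short:.0f}, "
          f"lower sigma_L -> {r_stable:.0f}")
# Under the normal approximation, any CSL above 50% (z > 0) makes reducing
# sigma_L lower R; the paper shows that the exact lead-time demand distribution
# instead yields a threshold above 50% below which reducing sigma_L raises R.
```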

Journal ArticleDOI
01 Mar 2004
TL;DR: This work presents a new approach that identifies the location of a type error as a set of program points (a slice) all of which are necessary for the type error.
Abstract: Previous methods have generally identified the location of a type error as a particular program point or the program subtree rooted at that point. We present a new approach that identifies the location of a type error as a set of program points (a slice) all of which are necessary for the type error. We identify the criteria of completeness and minimality for type error slices. We discuss the advantages of complete and minimal type error slices over previous methods of presenting type errors. We present and prove the correctness of algorithms for finding complete and minimal type error slices for implicitly typed higher-order languages like Standard ML.
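
A toy of the minimality criterion: treat typing as equality constraints collected from program points, then shrink an unsatisfiable constraint set by deletion so that every remaining point is necessary for the error. The constraint language and program points here are invented; the paper's algorithms operate on full implicitly typed languages like Standard ML:

```python
def satisfiable(constraints):
    """Equality constraints between type variables ('x, 'y, ...) and ground
    types. Union-find; unsatisfiable when one class holds two ground types."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    for lhs, rhs in constraints:
        a, b = find(lhs), find(rhs)
        if a != b:
            parent[a] = b
    ground = {}
    for t in ("int", "bool"):               # ground types unify only with themselves
        root = find(t)
        if root in ground and ground[root] != t:
            return False
        ground[root] = t
    return True

# Constraints tagged with the (hypothetical) program point that generated them.
slice_ = [("p1", ("'x", "int")), ("p2", ("'x", "'y")),
          ("p3", ("'y", "bool")), ("p4", ("'z", "int"))]

# Deletion-based minimization: drop any constraint not needed for the error.
for item in list(slice_):
    if not satisfiable([c for p, c in slice_ if p != item[0]]):
        slice_.remove(item)
print("minimal type error slice:", [p for p, _ in slice_])  # p1, p2, p3
```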