Michael A. Campion
Researcher at Purdue University
Publications - 144
Citations - 18938
Michael A. Campion is an academic researcher from Purdue University. The author has contributed to research in topics: Job design & Job performance. The author has an h-index of 60 and has co-authored 141 publications receiving 17,570 citations. Previous affiliations of Michael A. Campion include North Carolina State University & Saint Petersburg State University.
Papers
Journal Article
Applicant Reactions to Different Selection Technology: Face-to-Face, Interactive Voice Response, and Computer-Assisted Telephone Screening Interviews
TL;DR: In this article, a sample of students experienced one of three screening techniques with identical content (face-to-face interviews, computer-assisted telephone interviews, or interactive voice response (IVR) screening) in a pre- to post-screening longitudinal study.
Journal Article
Development and Test of a Task Level Model of Motivational Job Design
Chi Sum Wong, Michael A. Campion +1 more
TL;DR: In this paper, the authors predict the motivational value of jobs from task characteristics, task interdependence, and task similarity, and find that task similarity does not correlate well with job satisfaction.
Journal Article
Understanding reactions to job redesign: A quasi-experimental investigation of the moderating effects of organizational context on perceptions of performance behavior.
TL;DR: In this paper, a longitudinal quasi-experimental study showed that although such a redesign had positive effects on three performance behaviors (effort, skill usage, and problem solving), its effectiveness also depended on aspects of the organizational context.
Journal Article
Coming Full Circle: Using Research and Practice to Address 27 Questions About 360-Degree Feedback Programs.
TL;DR: This paper reviews the research evidence on practical issues faced when implementing a 360-degree feedback system, addressing 27 specific questions that often arise in the development, implementation, administration, and interpretation of multisource feedback programs.
Journal Article
Initial investigation into computer scoring of candidate essays for personnel selection.
TL;DR: It is suggested that computer scoring of candidate essays may provide a cost-effective means of using predictors that have comparable validity but have previously been too expensive for large-scale screening; the paper also discusses the potential implications of using computer scoring to address the adverse impact-validity dilemma.