Journal ISSN: 1035-719X

Evaluation Journal of Australasia

SAGE Publishing
About: Evaluation Journal of Australasia is an academic journal published by SAGE Publishing. The journal publishes mainly in the areas of project commissioning and government. It has an ISSN identifier of 1035-719X. Over its lifetime, the journal has published 383 papers, which have received 15,044 citations. The journal is also known as: EJA.


Papers
Journal Article
TL;DR: Patton suggested that if one had to choose between implementation information and outcomes information because of limited evaluation resources, there are many instances in which implementation information would be of greater value.
Abstract: ‘In Utilization-Focused Evaluation (Patton, 1978) I suggested that if one had to choose between implementation information and outcomes information because of limited evaluation resources, there are many instances in which implementation information would be of greater value. A decision maker can use implementation information to make sure that a policy is being put into operation according to design – or to test the feasibility of the policy. Unless one knows that a program is operating according to design, there may be little reason to expect it to produce the desired outcomes. Furthermore, until the program is implemented and a ‘treatment’ is believed to be in operation, there may be little reason even to bother evaluating outcomes. Where outcomes are evaluated without knowledge of implementation, the results seldom provide a direction for action because the decision maker lacks information about what produced the observed outcomes (or lack of outcomes). ... It is important to study and evaluate program implementation in order to understand how and why programs deviate from initial plans and expectations. Such deviations are quite common and natural ...’ (Patton, 1980, p. 69; Patton, 1990, p. 105; Patton, 2002, p. 161)

12,369 citations

Journal Article
TL;DR: In this article, the authors ask whether an organisation that wishes to evaluate one of its programs should use a staff member or hire someone from outside the organisation, noting that surprisingly little guidance is available for this choice.
Abstract: An organisation wishes to evaluate one of its programs. It can ask a staff member or hire someone outside the organisation. Which should it choose? Surprisingly little guidance is available for this...

104 citations

Journal Article
TL;DR: The article reviews the use of photos as stimuli for talking about health settings before presenting three recent case studies in which photo-interviewing has been used successfully in health evaluation and research.
Abstract: This article reviews the use of photographs as data within the social sciences as well as defining related terminology used over the past century. It then examines the use of photos as stimuli for ...

90 citations

Journal Article
TL;DR: The fact-value dichotomy, an unresolved issue central to evaluation, is discussed by House and Howe, who look at how the issue is currently addressed in philosophy and apply these insights to evaluation theory.
Abstract: This book is by an evaluation theorist, Ernie House, and a philosopher, Kenneth Howe. The aspiration of the authors is to reconcile evaluation theory with general currents in contemporary philosophy. Many important insights of the past 20 years of philosophy remain untapped. The focus of the book is on the infamous fact-value dichotomy, an unresolved issue central to evaluation. The authors look at how this issue is addressed in philosophy currently and apply these insights to evaluation theory, as well as adding their own original analysis to the topic. The book is on evaluation theory rather than on practice, even though the authors do suggest implications for practice. As the authors indicate, theory can inform practice and practice can inform theory. Although the attention of the authors is centred on evaluation, they believe their insights apply to educational and social research as well, because they are haunted by similar value issues. Evaluation and social research often blend together, but they can be distinguished from each other. Evaluation arrives at conclusions such as ‘X is good’, whereas social research arrives at conclusions such as ‘X causes Y’ or ‘X is a case of Y’. Where Y can be demonstrated as something worthwhile, the two are similar. However, while the term ‘social research’ is in the title of the book, the focus is on evaluation alone. The book is in three parts. The first part of the book deals with the nature of values and value claims. A legacy of positivism is that there is a strict separation between facts and values. The authors (pp. xv–xvi) state the positivist argument:

66 citations

Journal Article
TL;DR: Drawing on a 10-year period of participatory evaluation and participatory action research (PAR), the paper identifies significant sources of rigour, including participation and communication methods that develop relations of mutual trust and open communication, and the use of multiple theories and methodologies, multiple sources of data, and multiple methods of data collection.
Abstract: Participatory evaluation and participatory action research (PAR) are increasingly used in community-based programs and initiatives and there is a growing acknowledgement of their value. These methodologies focus more on knowledge generated and constructed through lived experience than through social science (Vanderplaat 1995). The scientific ideal of objectivity is usually rejected in favour of a holistic approach that acknowledges and takes into account the diverse perspectives, values and interpretations of participants and evaluation professionals. However, evaluation rigour need not be lost in this approach. Increasing the rigour and trustworthiness of participatory evaluations and PAR increases the likelihood that results are seen as credible and are used to continually improve programs and policies.
Drawing on learnings and critical reflections about the use of feminist and participatory forms of evaluation and PAR over a 10-year period, significant sources of rigour identified include:
• participation and communication methods that develop relations of mutual trust and open communication
• using multiple theories and methodologies, multiple sources of data, and multiple methods of data collection
• ongoing meta-evaluation and critical reflection
• critically assessing the intended and unintended impacts of evaluations, using relevant theoretical models
• using rigorous data analysis and reporting processes
• participant reviews of evaluation case studies, impact assessments and reports.

61 citations

Performance Metrics
No. of papers from the Journal in previous years
Year    Papers
2023    20
2022    33
2021    14
2020    17
2019    19
2018    16