Supporting hypothesis generation by learners exploring an interactive computer simulation

Citation for published version (APA):
Joolingen, van, W. R., & Jong, de, A. J. M. (1992). Supporting hypothesis generation by learners exploring an interactive computer simulation. Instructional Science, 20, 389-404. https://doi.org/10.1007/BF00116355
DOI:
10.1007/BF00116355
Document status and date:
Published: 01/01/1992
Document Version:
Publisher’s PDF, also known as Version of Record (includes final page, issue and volume numbers)

Instructional Science 20: 389-404 (1991)
© Kluwer Academic Publishers, Dordrecht - Printed in the Netherlands
Supporting hypothesis generation by learners exploring an
interactive computer simulation*
WOUTER R. VAN JOOLINGEN¹ & TON DE JONG²
¹Eindhoven University of Technology, Department of Philosophy and Social Sciences, P.O. Box 513,
NL-5600 MB Eindhoven, The Netherlands
²University of Twente, Department of Education, Division of Instructional Technology, P.O. Box 217,
NL-7500 AE Enschede, The Netherlands
Abstract. Computer simulations provide environments enabling exploratory learning. Research has
shown that these types of learning environments are promising applications of computer assisted learning
but also that they introduce complex learning settings, involving a large number of learning processes. This
article reports on an instrument for supporting one of these learning processes: stating hypotheses.
The resulting instrument, an hypothesis scratchpad, was designed on the basis of a conceptual
representation of the simulation model and tested in an experimental study. In this study three versions
of the scratchpad, varying in structure, were compared. It was found that support offered for identifying
variables, in the form of a selection list, is relatively successful: students who used this list were better
in differentiating different types of variables. For identifying relations, a selection list of relations offered
to the students proved unhelpful in finding accurate relations: students using this list stated their
hypotheses mainly at a very global level.
Introduction: supporting exploratory learning with simulations
Learning with computer simulations is a promising application of instructional
technology. A main reason is that computer simulations enable the creation of
relatively cheap, safe and well-accessible exploratory learning environments
(Alessi and Trollip, 1987; De Jong, 1991; Reigeluth and Schwartz, 1989). These
types of environments provide complex learning settings involving a large number
of specific learning processes. In a recent study Njoo and De Jong (1991; see also
De Jong and Njoo, 1990) made observations of students working with a computer
simulation. They distinguished the following main categories of learning processes
(apart from regulative processes, concerned with planning, and processes involved
with operation of the simulation system):
- analysis
- hypothesis generation
- hypothesis testing
- evaluation
* The research reported was conducted in the project SIMULATE. SIMULATE was part of SAFE, a
R&D project partially funded by the CEC under contract D1014 within the Exploratory Action of the
DELTA programme. The work of SIMULATE is continued in the DELTA main phase project
SMISLE.

The process of analysis is concerned with identifying variables and global model
properties. In this phase the first, often not yet well-articulated, ideas about the
underlying simulation model may arise, leading to the generation of hypotheses
about the simulation. Hypotheses must be tested to become a part of the learner’s
(mental) model of the simulation. This testing includes the design of an experiment
which will be performed with the simulation, predicting the outcomes of the
experiment on the basis of the hypothesis, performing the experiment and
interpreting the results (Njoo and de Jong, 1992). This may lead to rejection of or
support for the hypothesis and may give rise to the generation of a new hypothesis
or a reformulation of an old one. Then the process may start over again. Also the
learner can choose to investigate another part of the simulation model and state an
hypothesis about that part. This process can continue until all parts of the
simulation have been investigated and the learner has discovered the complete
model. Research has shown that generation of hypotheses and designing
experiments to test these are both important and problematic parts of discovery
learning (Gorman and Gorman, 1984; Gorman, Stafford and Gorman, 1987; Klahr
and Dunbar 1988; Mynatt, Doherty and Tweney, 1977, 1978; Njoo and de Jong,
1991; Wason, 1960).
Hypothesis generation and testing
Klahr and Dunbar (1988; Dunbar and Klahr, 1989; Shrager and Klahr, 1986)
studied the formation of hypotheses and the design of experiments to test these,
with students discovering the operation of a simple device. Their research results in
a theory of scientific discovery as dual search (SDDS), partially based on general
theories of problem solving (Newell and Simon, 1972; Greeno and Simon, 1988).
They propose it as “a general model of scientific reasoning, that can be applied to
any context in which hypotheses are proposed and data is collected” (p. 32). SDDS
describes the scientific discovery process as a search process in an hypothesis
space, containing all possible hypotheses about the system under study, and in an
experiment space, consisting of all experiments that can be carried out with the
system.
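The two search spaces can be made concrete with a small sketch. The toy device, the candidate rules, and all names below are our own illustrative assumptions, not material from Klahr and Dunbar's study:

```python
from itertools import product

# Hypothesis space: candidate rules for how a toy device maps two binary
# switches to one light. Each entry is one possible hypothesis.
HYPOTHESES = {
    "light = a":       lambda a, b: a,
    "light = b":       lambda a, b: b,
    "light = a AND b": lambda a, b: a and b,
    "light = a OR b":  lambda a, b: a or b,
}

# Experiment space: every switch setting the learner can try on the device.
EXPERIMENTS = list(product([False, True], repeat=2))

def consistent(rule, observations):
    """A hypothesis survives only if it reproduces every observed outcome."""
    return all(rule(a, b) == outcome for (a, b), outcome in observations)

# Suppose the device actually computes AND. Running all experiments and
# pruning the hypothesis space leaves exactly one surviving hypothesis.
device = lambda a, b: a and b
observations = [(setting, device(*setting)) for setting in EXPERIMENTS]
surviving = [name for name, rule in HYPOTHESES.items()
             if consistent(rule, observations)]
print(surviving)  # ['light = a AND b']
```

In these terms, a theorist commits to one entry of HYPOTHESES and runs experiments against it, while an experimenter sweeps EXPERIMENTS first and lets the accumulated observations rule hypotheses out.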
Klahr and Dunbar’s findings indicate that there are two types of strategies for
searching these spaces (see also Greeno and Simon, 1988, p. 640). The first is a
bottom-up strategy (used by what they call experimenters) consisting of a first phase
in which an hypothesis is tested, followed by a phase where the subject searches the
experiment space without explicitly stating hypotheses. The main characteristic of
experimenters is that they perform experiments which rule out all other possible
hypotheses before they actually state the correct hypothesis. In other words,
experimenters cannot reach certain parts of the hypothesis space without prior
experimental validation.
In the second, top-down strategy (used by so-called theorists) experiments are
never performed without the prior statement of an hypothesis. Typically, a theorist
states an hypothesis before carrying out an experiment and switches to a new

hypothesis only after sufficient contradicting evidence has been found. The new
hypothesis stated will mostly not differ radically from the old one. Typically, only
one relevant aspect will have been changed. Theorists do not need to conduct a
critical experiment before the correct hypothesis is stated. Generally, theorists
require fewer experiments than do experimenters to reach the correct conclusion.
Similar distinctions are reported by Shute, Glaser, and Raghavan (1989) (see also
Shute and Glaser, 1986, 1990; Shute, Glaser and Resnick, 1986). They have studied
the use of a system for learning laws of economics, Smithtown. The system is a
freely explorable computer simulation of a model of an economy. Students are
invited to explore the simulation to discover the laws that determine the underlying
model. Shute and others refer to theorists’ behaviour as hypothesis driven and to
experimenter behaviour as data-driven. Moreover, they conclude that
hypothesis-driven subjects are more successful than data-driven subjects.
A study by Wason (1960) and two related studies by Gorman et al. (Gorman and
Gorman 1984; Gorman, Stafford and Gorman, 1987) showed that students, once
they have formed an hypothesis (in their case in a simple domain: discovering
regularity in sequences of three numbers), tend to seek confirming evidence for this
hypothesis, i.e. they design experiments which are aimed at obtaining data in
support of the hypothesis and not at obtaining data able to discriminate between
hypotheses. This may result in long series of fruitless experiments. When the
hypothesis space is reduced to a small set of conflicting hypotheses, by offering a
small set of rules to the students from which they could choose, the search for the
right rule proved to be far more successful.
Recent research by Njoo and De Jong (1992) showed that the formation of
hypotheses about a simulation is one of the most problematic parts of the
exploratory learning setting. Their research shows that students spontaneously
generated very few hypotheses and that they confuse hypotheses and predictions.
From the research described above we may conclude that hypothesis generation
is a problematic issue in discovery learning contexts, notably simulations, leading
to a search for ways to support this study process. Based on SDDS (Klahr and
Dunbar 1988, 1989) we can offer this support by elucidating the structure of
hypothesis space to the learner. In the present study this is done by providing
learners with a mock-up hypothesis scratchpad, a software instrument, or learner
instrument, on which the student can note down hypotheses.
Hypothesis scratchpads for simulations
Hypothesis scratchpads offer learners a structured overview of hypothesis space.
This implies that for designing these scratchpads we first need to investigate the
specific properties of hypothesis space for simulations.
In Van Joolingen and De Jong (1991b), it is argued that hypotheses about
simulations are formed in principle about a conceptual model describing the
properties of the simulation model, that is: An hypothesis about a simulation model
is a statement that a certain generic relation holds between two or more conceptual

variables (see also Reimann, 1989). The term conceptual variable stands for a
generalisation of the variables present in the computer simulation; the term generic
relation is a generalisation of the traditional relation concept, allowing for fuzzy
and incomplete descriptions of a certain relationship (Van Joolingen and De Jong,
1991a; 1992).
The definition above determines the dimensions of the hypothesis space, being a
superposition of the space of all possible combinations of conceptual variables and
the space of all possible relations between these variables. This implies that the
process of forming hypotheses consists of the following subprocesses:
- identifying variables
- selecting variables
- defining the (generic) relation that is hypothesized to hold between the
selected variables.
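As a sketch, the three subprocesses map naturally onto a small data model. The class names, field names, and the example variables below are our own illustrative choices, not the paper's:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ConceptualVariable:
    name: str
    kind: str  # e.g. "input" or "output" in the simulation model

@dataclass(frozen=True)
class Hypothesis:
    relation: str      # a generic relation; may be fuzzy or incomplete
    variables: tuple   # the selected conceptual variables

# Subprocess 1: identify the variables present in the simulation.
identified = [ConceptualVariable("price", "input"),
              ConceptualVariable("quantity_demanded", "output"),
              ConceptualVariable("quantity_supplied", "output")]

# Subprocess 2: select the variables the hypothesis will be about.
selected = tuple(v for v in identified
                 if v.name in ("price", "quantity_demanded"))

# Subprocess 3: define the generic relation hypothesized to hold between
# them; "inverse" is deliberately global and incomplete, as a learner's
# first hypothesis often is.
hypothesis = Hypothesis(relation="inverse", variables=selected)
print(hypothesis.relation, [v.name for v in hypothesis.variables])
```

The hypothesis space described above is then the set of all such Hypothesis values: every combination of selected variables crossed with every generic relation.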
In the literature, several learner instruments supporting one or more of these
subprocesses of hypothesis generation are described. Smithtown (Shute and Glaser,
1990) contains a so-called Hypothesis Menu, that supports students in stating
hypotheses about the model. The hypothesis menu offers a structured framework
for entering hypotheses. The most important elements are the Objects and Verbs.
The objects correspond to variables in the simulation and verbs express the
behaviour of the objects under conditions, expressed in the same hypothesis. The
other two elements in the hypothesis menu are connectors and the direct object
menu which are used respectively to produce well formed sentences and to specify
the hypothesis more precisely. A sample hypothesis entered in the hypothesis menu
could be: “As price increases then quantity demanded decreases” (Shute and
Glaser, 1990, p. 292).
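A menu of this kind can be sketched as a simple sentence builder. The menu entries below are our own illustrative choices in the spirit of Smithtown, not its actual menus:

```python
# Objects name simulation variables; verbs name their behaviour; the fixed
# connectors "As ... then ..." glue the selections into a well-formed sentence.
OBJECTS = ["price", "quantity demanded", "quantity supplied"]
VERBS = ["increases", "decreases", "stays the same"]

def build_hypothesis(obj1, verb1, obj2, verb2):
    """Compose a well-formed hypothesis sentence from menu selections."""
    if obj1 not in OBJECTS or obj2 not in OBJECTS:
        raise ValueError("object not on the menu")
    if verb1 not in VERBS or verb2 not in VERBS:
        raise ValueError("verb not on the menu")
    return f"As {obj1} {verb1} then {obj2} {verb2}"

sentence = build_hypothesis("price", "increases",
                            "quantity demanded", "decreases")
print(sentence)  # As price increases then quantity demanded decreases
```

Restricting selections to the menus is what gives the learner a structured view of hypothesis space while still leaving the actual combination to him/her.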
Michael, Haque, Rovick and Evens (1989) use an hypothesis menu in a learning
environment for pathophysiology problems. The goal that the learner is to achieve
is to locate a malfunction in a patient on the basis of given symptoms. The
hypotheses that can be entered take the form of an area of the model where the
defect may be located. The learner may select his/her hypotheses from a menu of
ready-made hypotheses. The system offers the learner a set of nested menus to
select from. After the hypothesis has been chosen the learner can collect data to
support his/her hypothesis.
In general we can define an hypothesis scratchpad as a learner instrument that
can support some or all of the subprocesses of forming hypotheses mentioned
above by offering the elements needed for hypotheses (variables and relations) and
by adding some structure to these elements.
The scratchpad should not be imperative, in the sense that it actually forces students
to generate a particular hypothesis and/or to carry out one specific experiment.
Furthermore, it should not give away too much information, such as providing ready-made
hypotheses, since the actual generation of hypotheses is one of the goals of
the self-discovery process.
