Scaffolding Students’ Problem-Solving
Processes in an Ill-Structured Task Using
Question Prompts and Peer Interactions
Xun Ge
Susan M. Land
This study examined the effects of question
prompts and peer interactions in scaffolding
undergraduate students’ problem-solving
processes in an ill-structured task in problem
representation, developing solutions, making
justifications, and monitoring and evaluating.
A quasi-experimental study, supplemented by
multiple-case studies, was conducted to
investigate both the outcomes and the
processes of student problem-solving
performance. The quantitative outcomes
revealed that question prompts had
significantly positive effects on student
problem-solving performance but peer
interactions did not show significant effects.
The qualitative findings, however, did indicate
some positive effects of peer interactions in
facilitating cognitive thinking and
metacognitive skills. The study suggests that
the peer interaction process itself must be
guided and monitored with various strategies,
including question prompts, in order to
maximize its benefits.
Many researchers (e. g., Bransford, Brown, &
Cocking, 2000; Bransford & Stein, 1993; Jonassen,
1997) have emphasized the importance of engag-
ing students in complex, ill-structured problem-
solving tasks, which are intended to help
students see the meaningfulness and relevance of
what they learn and to facilitate transfer by con-
textualizing knowledge in authentic situations.
Yet previous research has pointed to student
deficiencies in problem solving, for instance, a
failure to apply knowledge from one context to
another (Gick, 1986; Gick & Holyoak, 1980), espe-
cially when solving ill-structured problems (Fel-
tovich, Spiro, Coulson, & Feltovich, 1996).
Students’ difficulties in problem solving have
been attributed to both limited domain and
metacognitive knowledge (Brown, 1987).
According to Vygotsky (1978), learners
should be guided or scaffolded by a “more
capable peer” to solve a problem or carry out a
task that would be beyond what they could ac-
complish independently (p. 86). The notion of
scaffolding has traditionally emphasized the
role of dialogue and social interaction to foster
comprehension-monitoring strategies (see e.g.,
reciprocal teaching and peer-regulated learning,
Palincsar & Brown, 1984; Palincsar, Brown, &
Martin, 1987); however, externalized support
during problem solving has also been ac-
complished through strategies such as modeling
(Schoenfeld, 1985), prompting (Scardamalia,
Bereiter, McLean, Swallow, & Woodruff, 1989;
Scardamalia, Bereiter, & Steinbach, 1984), and
guided, student-generated questioning (King,
1991). Such strategies have been found to be ef-
fective in fostering comprehension monitoring,
problem solving (e.g., Palincsar & Brown, 1984;
Scardamalia et al., 1989), and reflective thinking
(Lin, Hmelo, Kinzer, & Secules, 1999).
However, previous research has seldom in-
vestigated the effectiveness of those strategies in
supporting ill-structured problem solving, a
process that characterizes the types of complex
problems that we encounter in everyday life. Ill-
structured problems have vaguely defined or
unclear goals (Voss & Post, 1988), and the infor-
mation needed to solve them is not entirely con-
tained in the problem statements (Chi & Glaser,
1985). A problem qualifies as ill defined if any
one of the three components (an initial state,
operators, and a goal state) is not well specified
(Chi & Glaser). We believed that scaffolding
strategies could be adapted to support students’
cognitive and metacognitive skills during ill-
structured problem solving. In this study, we
were specifically interested in examining the ef-
fects of question prompts and peer interactions
to scaffold novice learners’ problem-solving
processes in an ill-structured task.
Theoretical Background
Ill-Structured Problem-solving Processes
Ill-structured problems are defined as having
vague goals (Chi & Glaser, 1985; Voss & Post,
1988) that permit multiple solutions or solution
paths (Kitchner, 1983). By contrast, well-struc-
tured problems have single solutions, optimal
solution paths, and structured goals (Sinnott,
1989). Solving well-structured problems nor-
mally involves representing the problem, sear-
ching for solutions, and implementing solutions
(Gick, 1986). However, because of the nature of
an ill-structured problem, its solution process is
different from that of a well-structured problem.
Problem representation, justification skills,
monitoring, and evaluation are the primary re-
quirements for ill-structured problem solving
(Sinnott; Voss & Post). According to Voss and
Post, problem representation involves examin-
ing concepts and relations of a problem, isolat-
ing the major factors causing the problem and its
constraints, and recognizing divergent perspec-
tives. Once a problem is represented, solutions
can be derived by finding ways to eliminate the
causes of the problem and then developing cor-
responding procedures for implementing them.
Since a problem solver must select a good solu-
tion from among many, he or she must generate
a viable, defensible, and cogent argument to sup-
port the problem solution (Jonassen, 1997; Voss
& Post); thus justification skills are paramount to
solving ill-structured problems (Jonassen;
Kitchner & King, 1981). In addition, the problem
solver must evaluate his or her solution by ex-
amining and defending it against other alterna-
tives. Hence, monitoring and evaluation are
required throughout the process, from identify-
ing the essence of the problem to selecting the
best goal for solving it (Sinnott). Learners
monitor their own processes and movements
from state to state, and select information, solu-
tions, and emotional reactions (Sinnott).
Cognition and Metacognition in Solving
Ill-Structured Problems
Solving ill-structured problems requires
domain-specific knowledge (Voss & Post, 1988;
Voss, Wolfe, Lawrence, & Engle, 1991) as well as
structural knowledge (Jonassen, Beissner, &
Yacci, 1993). Domain-specific knowledge is con-
tent knowledge consisting of cognitive com-
ponents such as propositional information,
concepts, rules, and principles. Structural
knowledge is knowledge of how concepts
within a domain are interrelated and requires
integration of declarative knowledge into useful
knowledge structures (Jonassen et al.). How-
ever, in the absence of domain-specific
knowledge and structural knowledge, metacog-
nition, which involves both knowledge and
regulation of cognition (Pressley & McCormick,
1987), is necessary for solving ill-structured
problems. Chi, Bassok, Lewis, Reimann, and
Glaser (1989) found that successful learners tend
to generate more working explanations, par-
ticularly in response to an awareness of limited
understanding. Wineburg (1998) found that
metacognitive knowledge can compensate for
absence of relevant domain knowledge when
metacognitive awareness leads to recognizing
areas of limited understanding, adopting work-
ing hypotheses, asking questions, monitoring
thinking, and revisiting early interpretations.
Question Prompts as a Scaffolding Strategy
Question prompts have been found effective to
help students focus attention and monitor their
learning through elaboration on the questions
asked (Rosenshine, Meister, & Chapman, 1996).
Scardamalia et al. (1984) first used procedural
prompts, such as “An example of this . . .” and
“Another reason that is good. . .,” to scaffold
learners with specific procedures or suggestions
to help them plan their writing. Later, King
(1991, 1992, 1994) provided students with
strategy-questioning prompt cards to teach
them how to make inferences and generaliza-
tions and to ask for and provide task-ap-
propriate elaboration. In one study, King (1991)
specifically emphasized the role of question
prompts in scaffolding metacognition. She
grouped questions into three metacognitive
categories: planning, monitoring, and evalua-
tion, which closely paralleled the general prob-
lem-solving model (problem identification,
searching for a solution, implementation of a
solution, and evaluation, Bransford & Stein,
1993; Gick, 1986). Questions such as “What is the
problem?” and “What do we know about the
problem so far?” were asked to help students
with planning.
Recently, researchers have integrated
prompts into computer-based instruction to
facilitate metacognition (Davis & Linn, 2000;
Hannafin, Land, & Oliver, 1999; Lin & Lehman,
1999). Lin and Lehman found that justification
prompts facilitated transfer to a contextually
dissimilar problem. Similarly, Davis and Linn
found that self-monitoring prompts embedded
in the Web-based knowledge integration en-
vironment (KIE) encouraged students to think
carefully about their activities and facilitated
planning and reflection. Hence, we believed that
question prompts could scaffold ill-structured
problem solving by eliciting thoughtful respon-
ses such as explanations and inferences (King &
Rosenshine, 1993) and constructing cogent argu-
ments (Kitchner & King, 1981).
Peer Interaction as a Scaffolding Strategy
Lin et al. (1999) argued that peer interaction sup-
ports reflective social discourse, thereby helping
learners to consider multiple points of views and
select the best one based on evidence. Previous
research (e.g., King, 1991; Palincsar et al., 1987;
Webb, 1989) indicated that peer interaction
could be an effective scaffolding strategy. Peer
interaction can be guided or unguided. Guided
peer interaction is typically modeled by a
teacher with specific instructions, such as Palinc-
sar and Brown’s (1984) reciprocal teaching, in
which a teacher initially models key activities
such as summarizing, questioning, predicting,
and clarifying, and then both the teacher and the
student take turns leading a dialogue. Addition-
al examples are Palincsar et al.’s peer modeling
process, whereby seventh-graders were taught
to be tutors to their same-age tutees, and King’s
studies (e.g., 1991, 1992, 1994), which focused on
guiding students to generate questions and
elaborate thinking during the peer interaction
process. A substantial body of research on
cooperative learning in the ’80s and early ’90s
(e.g., Johnson, Johnson, & Stanne, 1985, 1986;
Johnson, Johnson, Stanne, & Garibaldi, 1990)
also revealed the success of guided peer interac-
tions in improving student performance and
achievement by employing various group
processing strategies.
The peer interaction described in Webb’s (1989)
study was not specifically guided. It was charac-
terized by small groups of students who were
given materials to learn or a problem to solve and
expected to help each other learn the material.
They were not given specific roles, although they
may have had different abilities and background
experiences (Webb, 1989). Thus, their interaction
was contingent on voluntary engagement and
commitment to peer learning. Webb (1982, 1989)
found that, when learners were required to give
explanations to and ask questions of each other,
learning was enhanced. Similarly, Greene and
Land (2000) found that peer interaction during
open-ended learning was effective when group
members offered suggestions, negotiated ideas,
and shared their experiences. The process of ex-
planation presumably requires learners to clarify
concepts, reorganize thinking, and reconceptual-
ize the material. In the present study, both guided
(with question prompts) and unguided peer inter-
actions were studied to examine if they had dif-
ferential effects in facilitating ill-structured
problem solving.
Purpose of the Study
Despite the justification for the use of question
prompts to facilitate problem-solving activities,
the relationship between questioning strategies
and ill-structured problem solving has not been
sufficiently studied. A review by Rosenshine et
al. (1996) revealed that the majority of studies
on questioning strategies were focused
on activating prior knowledge and improving
comprehension. King (1991) studied the effects
of guided, student-generated questions during
peer interactions on metacognitive skills,
knowledge construction, and problem solving.
However, in King’s (1991) study, the problem-
solving task was well structured and the sub-
jects were children. Our study aimed to extend
King’s research on questioning strategies to the
context of ill-structured problem solving, and
with an adult population (i.e., college students).
The purpose of this study was to investigate
the effects of (a) question prompts, (b) peer inter-
actions, and (c) the combined strategies of ques-
tion prompts and peer interactions in
scaffolding undergraduate students’ problem-
solving processes in an ill-structured task. The
problem-solving outcomes and processes inves-
tigated were (a) problem representation, (b)
problem solution, (c) making justifications, and
(d) monitoring and evaluation, which feature
the major processes of ill-structured problem
solving (e.g., Jonassen, 1997; Sinnott, 1989; Voss,
1988; Voss & Post, 1988).
The question prompts in this study referred
to a set of questions, both domain specific
and metacognitive, prompting students to
attend to important aspects of a problem at dif-
ferent phases and assisting them to plan,
monitor, and evaluate the solution process. They
were thus categorized into different functional
types, which closely paralleled the four proces-
ses of ill-structured problem solving. For ex-
ample, paralleling the process of monitoring and
evaluation were a series of questions asked
under the category, “Am I on the right track?”
The question prompts were delivered either in
printed format or through the Web.
The peer interaction strategy under inves-
tigation is defined as small groups of three or
four students who were given an ill-defined
problem and told to collaborate in solving it.
Students were not assigned specific roles. They
were expected to engage in the problem-solving
task and actively interact with each other to
negotiate meaning, share knowledge, and
develop solutions. There were two versions of
peer interactions: guided (with question
prompts) versus unguided.
The study examined the following questions:
1. Does using question prompts and peer inter-
actions separately or in combination affect
student problem-solving processes (problem
representation, solution development, jus-
tification, and monitoring and evaluation of
solutions) in an ill-structured task?
2. Does using question prompts and peer inter-
actions separately or in combination in-
fluence student cognition and metacognition
in the process of developing solutions to ill-
structured problems?
The following hypotheses were generated
from Question 1 and were tested:
Hypothesis 1. Students receiving question
prompts will demonstrate better problem-solving
performance in an ill-structured task than those
who do not receive question prompts in problem
representation, solution development, justifica-
tion, and monitoring and evaluation of solutions.
Previous research has shown that question
prompts can facilitate explanation construction
(King, 1991, 1992; King & Rosenshine, 1993), plan-
ning, monitoring, and evaluation (Davis & Linn,
2000; King, 1991; Schoenfeld, 1985), and making
justifications (Lin & Lehman, 1999).
Hypothesis 2. Students working with peers will
demonstrate better problem-solving perfor-
mance in an ill-structured task than those work-
ing individually in problem representation,
solution development, justification, and moni-
toring and evaluation of solutions. Peer model-
ing and interaction have been found to facilitate
self-regulation (Brown & Palincsar, 1989), dis-
tribute expertise, and foster reflection on multiple
perspectives (e.g., Roschelle, 1992; Webb, 1989).
Hypothesis 3. Students working with peers and
also receiving question prompts will demon-
strate better problem-solving performance in an
ill-structured task than all the other treatment
groups in problem representation, solution
development, justification, and monitoring and
evaluation of solutions. Previous research has
shown that guided peer interaction is more ef-
fective than unguided peer interaction (e.g.,
Johnson et al., 1990; King, 1991; King & Rosen-
shine, 1993; Palincsar et al., 1987).
METHOD
Design
A quasi-experimental study, supplemented by
comparative, multiple-case studies, was em-
ployed to investigate the two research questions.
According to Greene, Caracelli, and Graham
(1989), using both quantitative and qualitative
methods helps a researcher to seek triangulation
of the results from different data sources; ex-
amine overlapping and different facets of a
phenomenon; discover paradoxes, contradic-
tions, and fresh perspectives; and expand the
scope and breadth of a study. The quasi-ex-
perimental study, designed to answer Research
Question 1, was conducted to measure students’
problem-solving outcomes in an ill-structured
task in the four problem-solving processes: (a)
problem representation, (b) solution develop-
ment, (c) justification, and (d) monitoring and
evaluation of solutions. The comparative, multi-
ple-case studies (Yin, 1989) served two pur-
poses: (a) to supplement and explain findings
for Research Question 1 and (b) to explore Re-
search Question 2 to gain insights into students’
problem-solving processes through think-aloud
protocols, interviews, and observations.
Participants and Context of the Study
Participants in the quasi-experimental design
were 117 undergraduate students recruited
from three class sections of an introductory
course in information sciences and technology
(IST) at a major university in northeastern
United States. Of these, 19 also participated in
the comparative, multiple-case studies. Most of
the students were freshmen majoring in infor-
mation sciences and technology, with a few stu-
dents from other majors.
The course was designed not only to provide
an overview of information sciences and tech-
nology, but also to integrate collaborative learn-
ing and problem-solving skills. It consisted of
both lecture and laboratory sessions. There were
two lecture sessions and one laboratory session
per week. The 75-min lecture sessions were taught
by a professor of information sciences and tech-
nology. The 115-min laboratory session was con-
ducted by a teaching assistant. Each of the three
class sections was taught by a different profes-
sor. All were equally experienced in teaching the
subject. Two teaching assistants taught the
laboratory sections, one being the first author of
this study, who conducted the labs for one of the
class sections. All three class sections shared a
common curriculum and a core textbook, with
approximately 50 students in each.
The primary purpose of the laboratory ses-
sions was to provide hands-on experience and
technology skills related to information sciences
and technology. There were two major goals of
the laboratory sessions: (a) developing basic in-
formation technology skills through skill
module exercises (e.g., spreadsheets and
database management systems); and (b)
developing problem-solving and collaborative
learning skills through case studies.
The Quasi-Experimental Study
The four conditions of the quasi-experimental
study were (a) peer-question (PQ), (b) peer-con-
trol (PC), (c) individual-question (IQ), and (d) in-
dividual-control (IC). We measured students’
problem-solving performance in an ill-struc-
tured task, the output of which was a problem-
solving report.
Sampling and Treatment Assignment
In order to study students’ problem-solving per-
formance in the natural setting of the classroom,
the study was integrated into the curriculum
and administered during a 115-min laboratory
session. Each class section was randomly as-
signed as an intact group to one or two of the
treatment conditions. Because there were only
three classes to be used for four different treat-
ment conditions, one class had to be split into
two conditions. Fifteen participants in the IQ
condition and 16 participants in the IC condition
were randomly assigned from Class A. Thirteen
groups comprising 48 participants in Class B were
assigned to the PQ condition while 11 groups total-
ing 38 students in Class C were assigned to the PC
condition. Those groups were preexisting and pre-
viously formed by the course professors for the
class projects. The normal size of each group was 4
students; however, due to attrition and absence,
some variations in group size occurred, resulting
in some 3-member groups. In the PQ condition
there were nine 4-member groups and four 3-
member groups; in the PC condition there were
five 4-member groups and six 3-member groups.
While the uneven distribution of group sizes
across the two conditions might be a concern, ac-
cording to Lou, Abrami, and d’Apollonia’s (2001)
literature review, small groups of 3 to 4 members
were more effective than larger groups, which sug-
gested little difference between 3-member and 4-
member groups.
Because of various constraints when this study
was conducted, pretest data to establish
equivalence of the three intact groups were not
available. Instead, a brief survey was conducted at
the end of the study, which provided useful infor-
mation on the participants’ profiles and prior prob-
lem-solving experience across different conditions.
In the IQ condition, there were 14 (93%) IST major
students and 1 (7%) nonmajor student. In the IC
condition, there were 14 (87.5%) IST major students
and 2 (12.5%) nonmajor students. In the PQ condi-
tion, there were 43 (90%) IST major students as com-
pared with 5 (10%) nonmajor students (3 of them
majoring in related fields, such as computer
science). The PC condition consisted of 31 (82%) IST
major students as compared with 7 (18%) nonmajor
students. More than 70% of students in the IQ, IC,
and PQ conditions and about 60% of the students in
the PC condition reported that they had some kind
of previous problem-solving experience. The self-
rated problem-solving skills across the four condi-
tions were statistically analyzed and did not
indicate significant differences.
Measurement and Treatment Material
The ill-structured problem-solving task material
was a complex, real-world problem related to
the domain of information science and technol-
ogy and developed by a course professor. The
materials were then validated by other IST
professors based on the major attributes of an ill-
structured problem (e.g., Chi & Glaser, 1985;
Jonassen, 1997; Kitchner, 1983; Sinnott, 1989;
Voss, 1988; Voss & Post, 1988). The problem
scenario for the task, as presented below, was ill-
structured in nature because subgoals were not
clear and the students had to generate and
define them. Additionally, the information
needed to solve the problem was not entirely
contained in the problem, the operators (actions
to be taken to solve the problems) were not
defined, and multiple solutions were possible.
Many customers complain that they have difficulty
finding items in a large supermarket (the W Store).
This problem especially affects college students, who
often have very little time for shopping. Since students
are major customers in this small college town, the
manager of the local store has hired you (or your team)
as a consultant to propose information technology-
based solutions to the problem. Your task is to make
suggestions about the features to be included in a new
information system. As part of this, you are to develop
a simple model illustrating your proposed system.
Based on the findings of a survey, the proposed infor-
mation system should be able to help customers find
items quickly, to present an overall view of all the
items on a shelf and an aisle, and to map out the
shortest route for getting all the items a customer
needs to purchase. There may be some other important
factors you need to consider.
Students across all four conditions were
instructed to analyze the problem, propose in-
formation technology solutions, support their
solutions with evidence, and evaluate their solu-
tions. The output of the task was a two- to three-
page solution report, accompanied with a
diagram of their proposed system. In addition,
the students were asked to produce a prototype
of the database system described in their solu-
tion reports in order to satisfy the laboratory re-
quirement.
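For readers curious about the algorithmic kernel hiding in the scenario, the sketch below shows one way a student team might have prototyped the "shortest route" feature. It is a minimal illustration under our own assumptions: the nearest-neighbor heuristic, the (x, y) aisle coordinates, and all names are ours, not part of the original task materials.

```python
# Illustrative sketch only: a nearest-neighbor heuristic for the scenario's
# "shortest route" feature. Item coordinates are hypothetical; a real system
# would use the store's actual floor plan and a stronger route optimizer.
from math import dist

ITEM_LOCATIONS = {  # hypothetical (x, y) aisle positions in the W Store
    "milk": (1.0, 9.0),
    "bread": (4.0, 2.0),
    "coffee": (4.0, 8.0),
    "apples": (8.0, 3.0),
}

def plan_route(shopping_list, start=(0.0, 0.0)):
    """Greedily walk to the nearest unvisited item from the current position."""
    remaining = set(shopping_list)
    position, route = start, []
    while remaining:
        nearest = min(remaining, key=lambda item: dist(position, ITEM_LOCATIONS[item]))
        route.append(nearest)
        position = ITEM_LOCATIONS[nearest]
        remaining.remove(nearest)
    return route

print(plan_route(["apples", "milk", "coffee"]))  # ['apples', 'coffee', 'milk']
```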
Task performance was measured based on
assigned conditions instead of individual learn-
ing outcomes. In other words, individual stu-
dents were measured according to their
individual solution reports, while groups were
measured as collective units according to their
group solution reports. The comparison of the
individual reports and the group collective
reports was made because this study was
focused, not on measuring individual learning
outcomes with different treatments, but rather
on student performance in different grouping
contexts: individual versus groups.
The question-prompt treatment material
(Appendix A) was a list of 10 major questions
generated from the problem by the course
professors, the IST experts. The question
prompts were then organized and categorized
into four types: (a) problem representation
prompts, (b) solution prompts, (c) justification
prompts, and (d) monitoring and evaluation
prompts. Each category of prompt included
some subquestions. For instance, included in the
category of problem representation prompts
(How do I define the problem?) were subques-
tions such as What are the parts of the problem?
and What are the technical components of the
problem?
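Appendix A is not reproduced here, but a minimal sketch of how the four prompt categories might be structured for delivery on the course Web site follows. Only the lead questions and the two problem-representation subquestions quoted in this article come from the study; the key names, the solution-prompt wording, and the empty lists are our placeholders.

```python
# Sketch of a data structure for the four categories of question prompts.
# Quoted questions come from the article; structure and names are ours.
QUESTION_PROMPTS = {
    "problem_representation": {
        "lead": "How do I define the problem?",
        "subquestions": [
            "What are the parts of the problem?",
            "What are the technical components of the problem?",
        ],
    },
    "solution": {
        "lead": "What are my possible solutions?",  # placeholder wording
        "subquestions": [],
    },
    "justification": {
        "lead": "What are my reasons for the solutions?",
        "subquestions": [],
    },
    "monitoring_and_evaluation": {
        "lead": "Am I on the right track?",
        "subquestions": [],
    },
}

def prompts_for(phase):
    """Return the lead question for a phase followed by its subquestions."""
    entry = QUESTION_PROMPTS[phase]
    return [entry["lead"], *entry["subquestions"]]
```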
An analytical rubric system developed by the
researchers was used to evaluate students’ prob-
lem-solving reports. It was based on the
theoretical framework of ill-structured problem
solving (e.g., Chi & Glaser, 1985; Jonassen, 1997;
Kitchner, 1983; Sinnott, 1989; Voss, 1988; Voss &
Post, 1988), and was reviewed and validated by
both the course professors and some experts in
the area of rubrics development. The rubrics
were modified and revised based on feedback
before being finalized. The rubric system had
four major constructs, each measuring one of the
four problem-solving processes in an ill-struc-
tured task: (a) problem representation, (b)
developing solutions, (c) making justification for
generating or selecting solutions, and (d)
monitoring and evaluating the problem space
and solutions. Each construct embodied specific
attributes, with performance specifications,
criteria, and ordinal values on different point
scales, such as 0–1–2–3 or 0–2–4. For instance,
the construct, making justifications, was
evaluated by two specific attributes: (a) con-
structing argument (ranging on a scale of 0–2–4),
and (b) providing evidence (ranging on a scale
of 0–1–2–3). In evaluating providing evidence, 0
was assigned if no evidence was provided, 1 was
assigned if evidence provided was not plausible, 2
was assigned if evidence was based on hypothetical
examples, and 3 was assigned if evidence was based
on previous experience or real examples. In evaluat-
ing constructing argument, 0 was assigned if no
argument was constructed, 2 was assigned if an ar-
gument was poorly constructed, and 4 was as-
signed if an argument was well constructed.
Because constructing argument was an im-
portant attribute, a 0–2–4 scale instead of a
continuous scale (i.e., 0–1–2) was used to dif-
ferentiate distinctively the students who
failed to provide an argument from those
who provided minimal or weak arguments,
and those who provided sound and cogent
arguments. The earned points for both con-
structing argument and providing evidence
were summated on a range of 0–7 points to
give an overall score for the construct,
making justification.
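To make the summation concrete, here is a short sketch of how the making-justification construct combines its two attribute scales into the 0–7 construct score. The scale levels follow the rubric description above; the function and variable names are our own.

```python
# Sketch of scoring the making-justification construct described above.
# Scale levels follow the rubric text; names are illustrative.
ARGUMENT_SCALE = {0: "no argument", 2: "poorly constructed", 4: "well constructed"}
EVIDENCE_SCALE = {0: "no evidence", 1: "not plausible",
                  2: "hypothetical examples", 3: "experience or real examples"}

def score_justification(argument_points, evidence_points):
    """Sum the two attribute scores into the construct's 0-7 range."""
    if argument_points not in ARGUMENT_SCALE:
        raise ValueError("constructing argument must score 0, 2, or 4")
    if evidence_points not in EVIDENCE_SCALE:
        raise ValueError("providing evidence must score 0, 1, 2, or 3")
    return argument_points + evidence_points

# A well-constructed argument backed by hypothetical examples scores 4 + 2 = 6.
assert score_justification(4, 2) == 6
```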
Administering the Study Sessions
The experimental study was administered by
the first author and a colleague in three 115-min
laboratory sessions in the same week. All the
study sessions were conducted in a classroom
equipped with laptop computers and LCD
projectors, where the participants had regular
lectures and lab sessions. In the first study ses-
sion, administered to the PQ condition, the par-
ticipants were told to work on the
problem-solving task in their preassigned
groups. The problem task materials and the
question prompts were posted on the course
Web site, to which students had access during
the session. At the same time, they were
provided with duplicate materials in paper for-
mat as a backup measure for any unexpected
technological problems. The students in this
condition were frequently reminded to refer to
the question prompts while solving the prob-
lem. In the second study session, administered
to the PC condition, students were also told to
work on the problem-solving task in their preas-
signed groups. The problem-solving task
material was delivered to them in the same way
as to the PQ group, but they were not provided
with question prompts. The third study session
was conducted with the IQ and IC conditions in
the same class section. These participants had
been randomly and previously assigned to
either the IQ or the IC condition, and they were
seated on opposite sides of the room. The first
author passed out the problem-solving task in
handouts (together with question prompts for
the IQ condition), which were color-coded for
the two different conditions. The participants
were instructed to work individually.
Throughout all the study sessions, the re-
searcher attended to students’ questions that re-
lated to procedures or requirements for the
study only. No hints or assistance associated
with the problem were provided.
Quantitative Data Analysis
Three raters, including the first author,
evaluated the problem-solving reports. Before
evaluating, they reached a conceptual consensus
on how to interpret the scoring rubrics through
discussion and examples. The first author
evaluated all the reports; the other two raters
evaluated 70% of the reports. Any discrepancies
of assigned values were discussed among the
raters and the adjudicated score was used. Con-
sequently, a high consensus was reached.
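The article reports rater agreement only qualitatively. For illustration, the exact-agreement proportion between two raters on one rubric attribute could be computed as sketched below; the score lists are invented, not the study's data.

```python
# Hypothetical illustration: exact agreement between two raters on one
# rubric attribute. The example scores are invented.
def exact_agreement(rater_a, rater_b):
    """Proportion of reports on which both raters assigned the same value."""
    if len(rater_a) != len(rater_b):
        raise ValueError("raters must score the same set of reports")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

print(exact_agreement([2, 4, 4, 0, 2], [2, 4, 2, 0, 2]))  # 0.8
```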
A 2 × 2 multivariate analysis of variance
(MANOVA) was conducted to examine the ef-
fects of question prompts and peer interactions,
as well as the interactive effect of question
prompts and peer interactions. Wilks's lambda
F (α = .05) was used in interpreting the multi-
variate test results. The use of MANOVA was
justified because of an overall correlation among
the four dependent variables (problem repre-
sentation, developing solutions, making jus-
tifications, and monitoring and evaluation)
indicated by the results of Pearson’s correlation,
which were significant at the .01 level. As shown
by the results of Box's M test and Levene's
test, the assumption of homogeneity of
variances and covariance matrices across
groups was met at the .05 alpha level,
satisfying this MANOVA testing
assumption. All the analyses were done
with the Statistical Package for the Social Scien-
ces (SPSS 11.0 for Windows).
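For readers who want to reproduce this design outside SPSS, the following is a minimal sketch of an equivalent 2 × 2 MANOVA using Python's statsmodels; the data file and column names are our assumptions, not artifacts of the study.

```python
# Sketch of the 2 x 2 MANOVA described above, run in Python's statsmodels
# instead of SPSS 11.0. The CSV file and column names are hypothetical.
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# One row per scored unit (individual report or group report): the four
# dependent variables plus the two factors, each coded 0/1.
df = pd.read_csv("problem_solving_scores.csv")  # hypothetical file

mv = MANOVA.from_formula(
    "representation + solutions + justification + monitoring"
    " ~ prompts * peers",
    data=df,
)
# The Wilks' lambda rows of this table correspond to the multivariate
# F tests reported in the Results section.
print(mv.mv_test())
```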
The Comparative, Multiple-Case Studies
In the comparative, multiple-case studies, a case
was defined as an individual participant or a
peer group. Four individual participants and
four peer groups comprised eight separate
cases. Selective (discriminative) sampling
(Strauss & Corbin, 1998) was used to maximize
the representation of cases (Stake, 2000) and the
opportunities for comparative analysis across
different conditions. Within each condition, par-
ticipants were selected based on informed con-
sent, level of verbal interaction (with peer
conditions), and willingness to be audio taped or
videotaped for think-aloud protocols, observa-
tions, and interviews.
Data Collection Techniques and Procedures
Think-aloud protocols are the verbalization of
one’s thinking process (Ericsson & Simon, 1996).
In this study, they referred to an individual’s
verbalizations while engaged in the problem-
solving task. The verbal protocols were audio
recorded and later transcribed verbatim. The
first author administered the think-aloud ses-
sions to the four individuals (two in the IQ con-
dition and two in the IC condition) separately
and independently from the other participants
in the same condition. She demonstrated the
think-aloud procedure through examples and
made sure that the participant could follow the
procedure before beginning to record. The first
author occasionally reminded participants to
talk out loud or to raise their voice.
The observations were made on videotape
during the experimental study session, and cap-
tured both actions and verbalizations. The pur-
pose for videotaping the cases was to gain more
detailed understanding of the problems and
processes experienced by learners during ill-
structured problem solving and how the scaf-
folds might have supported them in this
process. The selected groups were observed
together with the other groups of participants in
the same classroom. The first author circulated
about the room and took notes on interactions
for both the PQ and the PC conditions.
Videotapes of the problem-solving processes of
the selected groups were later transcribed ver-
batim and analyzed.
Structured interview protocols, such as
what? how? and why? were used to prompt stu-
dents to recall their problem-solving processes
and the effects of the question prompts and peer
interactions; for example:
• Would you please tell me how you solved the
problem, in detail, for example, how you ap-
proached the problem at first and how you
came up with solutions?
• What were your reasons for selecting those
solutions?
• Did you find that the questions provided were
helpful? In what ways? Please give examples.
• Did the group help you to solve this prob-
lem? How? Please give examples.
Except for one group, in which a member did
not want to be videotaped (and instead was
audio recorded), all the interviews with the peer
conditions were videotaped. All the interviews
with individual participants in the IQ and the IC
conditions were audio taped. The interview ses-
sions lasted approximately 30–40 min.
Qualitative Data Analysis
Pseudonyms were used for the eight selected cases
to protect the identity of the participants. All the
audio taped and videotaped data from the think-
aloud protocols, observations, and interviews were
transcribed for data analysis. Miles and
Huberman’s (1994) data analysis model, which in-
volves data reduction, data display, and conclusion
drawing and verification, was used to guide the
qualitative data analysis. The data analysis primari-
ly consisted of the following steps: reading and jot-
ting marginal notes on the transcripts; identifying
patterns and labeling concepts; organizing labeled
concepts into data display matrixes; identifying
themes; and drawing conclusions. For example, the
constructs and attributes of the rubrics were used
for labeling to examine the participants’ perfor-
mance in each of the problem-solving processes,
and new concepts were generated to examine stu-
dent behavior, reaction, and cognitive and
metacognitive process in the context of question
prompting or peer interaction. The next procedure
was to organize and display the labeled concepts so
that comparison could be made across different
cases and conditions. The data were displayed in
different ways to be viewed from different dimen-
sions, for instance, different conditions were com-
pared according to the four problem-solving
processes, and student behaviors and reactions
in peer or prompting conditions were organized
with data display matrixes (see examples in Ap-
pendix B). Finally, conclusions were drawn,
supported by examples.
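As an illustration of what such a data display matrix might look like, the sketch below crosses two cases with two problem-solving processes. The cell entries loosely paraphrase findings reported in the Results section and are not verbatim codes from the study.

```python
# Hypothetical sketch of a data display matrix (Miles & Huberman, 1994):
# labeled concepts arranged by case and problem-solving process so that
# patterns can be compared across conditions.
import pandas as pd

display_matrix = pd.DataFrame(
    {
        "Case 1 (IQ)": ["identified factors and constraints",
                        "justified each proposed suggestion"],
        "Case 7 (PC)": ["moved to solutions right away",
                        "offered little justification"],
    },
    index=["Problem representation", "Making justifications"],
)
print(display_matrix)
```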
RESULTS
Quantitative Outcomes
Table 1 summarizes the descriptive statistics for
the four problem-solving processes (the depend-
ent variables) by two factors: (a) individuals vs.
peers and (b) question prompts vs. no question
prompts. The n for peers in the table indicated
the number of groups, each of which consisted
of three to four students. The results of the eight
selected cases were also included in the data
analysis. Below is the statistical analysis report
in response to each of the hypotheses tested.
Question Prompting Effects
The hypothesis on question prompts predicted
that students who received question prompts
would perform significantly better than stu-
dents who did not receive question prompts.
The results of the two-way MANOVA
revealed a significant main effect for question
prompts, F(4, 48) = 17.371, p < .001, η² = .591,
which supported the hypothesis that students
who received question prompts would perform
significantly better than students who did not
receive question prompts. Further, a univariate
test of between-subjects effects revealed
significant effects of question prompts in all
four problem-solving processes: problem
representation, F(1, 51) = 51.051, p < .001,
MSE = 2.227, η² = .500; generating solutions,
F(1, 51) = 21.429, p < .001, MSE = .960, η² = .296;
making justification, F(1, 51) = 32.929, p < .001,
MSE = 1.424, η² = .392; and monitoring and
evaluation, F(1, 51) = 21.336, p < .001,
MSE = 3.658, η² = .295.
Table 1 shows means and standard deviations of
question prompt treatment groups in com-
parison with control groups (no-question-
prompt conditions), indicating that students who
received question prompts significantly out-
performed those who did not receive question
prompts in all four problem-solving processes.
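The reported effect sizes appear to be partial eta-squared values, which can be recovered directly from each F statistic and its degrees of freedom. The short check below is our addition; it reproduces the figures above to rounding.

```python
# Partial eta-squared from an F statistic and its degrees of freedom:
# eta^2 = (F * df_effect) / (F * df_effect + df_error)
def partial_eta_squared(f, df_effect, df_error):
    return (f * df_effect) / (f * df_effect + df_error)

for f, df1, df2, label in [
    (17.371, 4, 48, "multivariate prompt effect"),  # 0.591
    (51.051, 1, 51, "problem representation"),      # 0.500
    (21.429, 1, 51, "generating solutions"),        # 0.296
    (32.929, 1, 51, "making justification"),        # 0.392
    (21.336, 1, 51, "monitoring and evaluation"),   # 0.295
]:
    print(f"{label}: {partial_eta_squared(f, df1, df2):.3f}")
```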
Peer Interaction Effects
The hypothesis on peer interactions predicted
that students working with peers would per-
form significantly better than those working in-
dividually in problem-solving processes. The
results of the two-way MANOVA did not reveal
significant effects for peer interactions, F(4, 48) =
2.308, p = .071, η² = .161, and thus failed to
support the hypothesis. However, as shown by
Table 1, the total mean score of the peers
condition is much higher than that of the
individuals condition in problem representation.
To further explore the trend shown in the means,
we ran a post hoc univariate test. The result
showed that the peers significantly outperformed
the individuals in problem representation,
F(1, 51) = 6.991, p = .011, MSE = 2.227, η² = .121.
Interactive Effects of Question Prompts and
Peer Interactions
It was hypothesized that students working with
peers and also receiving question prompts
would demonstrate significantly better prob-
lem-solving skills than all the other conditions.
However, the result did not show a significant
interactive effect of the two strategies, F(4, 48) =
1.298, p = .284, η² = .098, and thus failed to
support the hypothesis. Despite this result,
Table 1 showed a trend for the PQ condition
to have higher means than the other conditions
(IQ, IC, and PC) in problem representation and
generating solutions.
Qualitative Findings
The eight cases selected for in-depth qualitative
study provided us with further insights into the
participants’ problem-solving performance in
different conditions to supplement the quantita-
tive findings. Below is a brief report of the per-
formance of the cases on the problem-solving
report, followed by a summary of the qualitative
findings on the effects of the question prompts
and the peer interactions.
Overall performance of cases on problem-solving
reports. Table 2 presents the raw scores of the
cases on the solution report. In general, the cases
in the question-prompt conditions (IQ and PQ)
showed higher raw scores than those in the non-
question-prompt conditions (IC and PC) in the
Table 1  Means and standard deviations for the problem-solving processes by question
prompts vs. no question prompts and individuals vs. peers.

                          Individuals           Peers                 Total
                          M     SD      n       M     SD      n      M     SD      n
Problem Representation
  Question Prompts        4.47  (1.55)  15      6.23  (1.74)  13     5.29  (1.84)  28
  No Question Prompts     2.25  (1.13)  16      2.64  (1.57)  11     2.41  (1.31)  27
  Total                   3.32  (1.74)  31      4.58  (2.45)  24     3.87  (2.15)  55
Generate Solutions
  Question Prompts        6.13  (0.83)  15      7.08  (0.95)  13     6.57  (1.00)  28
  No Question Prompts     5.38  (0.89)  16      5.36  (1.29)  11     5.37  (1.04)  27
  Total                   5.74  (0.93)  31      6.29  (1.40)  24     5.98  (1.18)  55
Make Justification
  Question Prompts        5.00  (1.20)  15      5.54  (1.27)  13     5.25  (1.24)  28
  No Question Prompts     3.63  (1.31)  16      3.18  (0.87)  11     3.44  (1.15)  27
  Total                   4.29  (1.42)  31      4.46  (1.61)  24     4.36  (1.50)  55
Monitor & Evaluate
  Question Prompts        4.20  (2.11)  15      4.31  (2.18)  13     4.25  (2.10)  28
  No Question Prompts     1.88  (1.59)  16      1.82  (1.72)  11     1.85  (1.61)  27
  Total                   3.00  (2.18)  31      3.17  (2.32)  24     3.07  (2.22)  55

Note. The possible ranges of scores for Problem Representation, Generating Solutions,
Making Justification, and Monitoring and Evaluating are 0–10, 0–8, 0–7, and 0–7,
respectively.
four problem-solving processes; the cases in the
peer condition without question prompts (PC)
did not show advantages over the individual
conditions (IQ and IC) in any of the problem-
solving processes. The PQ condition showed the
best performance in problem representation and
generating solutions.
Effects of the Question Prompts
Compared with the no-question-prompt condi-
tions, the students who received question
prompts engaged in the following cognitive and
metacognitive activities: (a) making intentional
efforts to identify factors, information, and con-
straints during the problem-representation
process; (b) organizing and planning for the
solution process and articulating solutions ex-
plicitly; (c) constructing arguments grounded in
factors identified during problem representation
and providing justification for each suggestion
proposed; and (d) intentionally evaluating the
selected solutions, comparing alternatives, and
justifying the most viable solution. Because of
space limitations, the examples presented below
are selective and representative.
The qualitative results showed that the ques-
tion prompts had an effect of directing student
attention to important information they might
have overlooked, thus facilitating awareness of
what is known and not known. For example, in
their think-aloud protocols, both Cathy (Case 1,
IQ) and Joe (Case 2, IQ) were prompted by the
questions to seek additional information and
identify important factors that helped them rep-
resent the problem space and make connections
among different factors and constraints. By com-
parison, Case 7 (PC) was observed to start the
solution discussion right away while Case 8 (PC)
was observed to have some initial discussion of
the problem; however, both groups failed to ad-
dress problem subgoals, factors, and constraints
in their report. The question prompts seemed to
help students to analyze the problem and repre-
sent the problem space.
Consistent with Lin and colleagues’ (1999)
notion of process modeling, the question
prompts might have also served as expert
modeling to guide students through the prob-
lem-solving process. Joe (Case 2, IQ) said that
the question prompts were helpful for him to or-
ganize his thoughts. Perry (Case 6, PQ) men-
tioned that the problem seemed vague at first,
but the question prompts served as guidelines to
help his group break down the problem into
small steps. Without question prompts, the stu-
dents seemed to have difficulty representing the
problem and developing solutions. As sug-
gested by Paul in the interview (Case 3, IC), he
had difficulty making connections between dif-
ferent parts of the problem and organizing the
information coherently. It is well known that
novices organize knowledge differently from ex-
perts (Chi, Feltovich, & Glaser, 1981; Gick, 1986),
which suggested a need to help students engage
the problem more deeply than their limited
knowledge structures might normally permit.
It was observed that the question prompts
also helped the students to state their reasons for
their proposed solutions and make their think-
ing visible. As shown by the think-aloud
protocols, justification prompts such as What
should the system do? and What are my reasons
Table 2  The raw scores of the cases on their problem-solving reports.

                                 Individual-Question  Individual-Control  Peer-Question   Peer-Control
                                 Case 1   Case 2      Case 3   Case 4     Case 5  Case 6  Case 7  Case 8
Problem Representation (0–10)    3        3           1        2          8       8       2       1
Developing Solutions (0–8)       7        6           6        5          8       8       6       6
Making Justification (0–7)       6        6           4        5          6       6       1       4
Monitoring & Evaluating (0–7)    4        7           2        3          3       7       1       3
for the solutions? prompted Joe (IQ) and Cathy
(IQ) to articulate why they had selected par-
ticular solutions. The students in Case 6 (PQ)
pointed out that the justification prompts helped
them to clarify, justify, and write down the
reasons for their solutions that might not have
been made explicit otherwise. In contrast, Paul
(IC) mainly described how his proposed tech-
nological system would work instead of justify-
ing his solution, as observed in both his report
and his think-aloud protocols. Presumably,
prompting learners to articulate their thinking
helps them become more aware of what they
know, which then makes their thinking avail-
able to them for reflection, monitoring, and
revision (Scardamalia et al., 1989).
Additionally, as indicated by Cases 1 and 2
(IQ), and 6 (PQ), the monitoring and evaluation
prompts helped students think about alternative
solutions and their viability, an aspect often over-
looked by novice problem solvers (Feltovich et al.,
1996). As reported by Joe (IQ), the prompts helped
him to think about side effects that he would not
have considered otherwise. Matt in Case 6 (PQ)
mentioned in the interview that his group always
went back to the main problem to make sure they
were on the right track by following the questions;
thus they were able not only to discuss the risks,
pros, and cons of their proposed system, but also to
make justifications for the viability of their solu-
tions after comparing the alternatives. On the other
hand, although Paul (IC) and Joanne in Case 7 (PC)
mentioned in the interview that they had thought
about or discussed the feasibility of the possible
solutions, they failed to assess constraints and
think about alternative solutions in their reports.
Thus, the question prompts might have served a
metacognitive function, helping students recognize
a need to know and to evaluate the limitations of
their solutions. To further support this conclusion,
Case 5 (PQ) reportedly ignored the monitoring
and evaluation prompts, which might explain
why they failed to justify and evaluate their final
solutions in their report.
Effects of Peer Interactions
The effects of peer interactions mainly involved
(a) building upon each other’s ideas to develop
solutions, (b) providing multiple perspectives,
(c) asking questions and providing feedback,
and (d) benefiting from distributed cognition.
Interview data and field notes indicated that
cases in peer conditions typically started the
problem-solving process by brainstorming
ideas, which were presented in the form of ques-
tions or suggestions, such as How about. . .?
What do you think. . .? That’s right, we can also.
. . . Then, an idea was developed. The solution
reports showed that the peer conditions tended
to take into account a wider range of factors and
information in generating or selecting solutions.
Perry in Case 6 (PQ) mentioned in the interview
that a topic “went back and forth for a while”
before solutions were developed and selected.
Cathy (IQ) said that she generated possibilities
and then quickly narrowed them down to one
solution, but Cases 5 and 6 (PQ) stated that they
spent some time “figuring out” the best solution.
The process of working together with peers to
develop the best solution may have helped the
students to construct multiple problem spaces,
which is required for solving ill-structured
problems (Jonassen, 1997).
All the cases in the peer conditions men-
tioned in their interviews that they were ex-
posed to different inputs and perspectives from
the peers. For example, the peers in Case 6 (PQ)
were observed from field observations to spend
considerable time brainstorming different ideas,
weighing pros and cons of various solutions,
and providing feedback to suggestions. In
response, Matt (Case 6) said, “The group work
made you see something you couldn’t see on
your own.” As shown by the interview data, the
other peer groups also shared the same ex-
perience, such as Cases 7 and 8 in the PC condi-
tion. We surmise that the peer interaction
process encouraged learners to identify alterna-
tive views or perspectives on the problem
(Jonassen, 1997), which in turn, helped them to
select the most relevant and useful solutions
(Sinnott, 1989).
Evidence from observations and interviews
showed that working collaboratively gave stu-
dents a chance to ask questions, offer sugges-
tions, elaborate thinking, and provide feedback,
although some students did not use this oppor-
tunity to offer critical comments. Mark (Case 5,
PQ) said that his group members asked ques-
tions and received feedback, which helped them to
develop solutions. Students in Case 7 (PC) were
seen discussing the needs, feasibility, pros and
cons of the proposed information technology sys-
tem through inquiry cycles of statements, ques-
tions, elaboration or feedback. Such a cycle, called
a “reflective toss” by van Zee and Minstrell (1997),
helps students make their meanings clear and
monitor their thinking process.
At the same time, videotaped observations of
Case 7 (PC) and Case 8 (PC) indicated that there
tended to be more agreement than disagreement
among the members, and few constructive sug-
gestions were made to each other. With Case 8
(PC), one student, Andy, reportedly had prior
classroom experience in solving a similar type of
problem, and hence tended to dominate the
group, and his peers tended to accept his sug-
gestions. Field observations and videotapes
showed that off-task joking and chatting took
place in Case 8. These data suggested that work-
ing together did not guarantee a positive peer
interaction process that challenged every mem-
ber to ask questions, elaborate thoughts, con-
struct arguments, or provide suggestions.
Another advantage of working with peers was
benefiting from each other’s expertise. To il-
lustrate, Bryan (Case 7, PC) was observed explain-
ing to his peers a technical term, and Devin (Case
8, PC) said that he had learned a problem-solving
strategy from Andy, his more experienced peer
group member. The students in Case 6 (PQ) shared
their understanding about server and user inter-
face when the group was trying to build the
database prototype, and Perry (Case 6, PQ)
pointed out that the group work had contributed
to high-quality solutions because four heads
provided more input than one head. During the in-
terview, Case 5 (PQ) noted that an important
benefit to them was the variety of ideas and the dif-
ferent areas of expertise that the peer interaction
process afforded. From field observations, we
found that the groups generally divided up their
work according to each student's expertise, such as
drafting documents, coming up with ideas, and
developing a database prototype, especially at the
latter part of the lab hours. These findings showed
that cognition could be distributed and amplified
across individuals with common goals and inter-
ests (Pea, 1993; Perkins, 1993; Salomon, 1993).
DISCUSSION
Both the quantitative and qualitative findings on
the effects of the question prompts support the
hypothesis that question prompts can facilitate
not only well-structured problem solving, as
shown by the studies of Schoenfeld (1985) and
King (1991), but also ill-structured problem solv-
ing. The findings confirm the results of the pre-
vious studies by King (e.g., 1989, 1991, 1992) and
King & Rosenshine (1993) that structured
guidance through questioning can enhance
knowledge representation. They are also consis-
tent with the research by Osman and Hannafin
(1994) and Wong (1985) indicating that ques-
tions could serve as cues to direct student atten-
tion to important information that the students
might have neglected. The effect of justification
prompts supports Lin and Lehman’s (1999) find-
ings that justification prompts directed student
attention to understanding when, why, and
how. The monitoring and evaluation prompts
help students to think about alternative solu-
tions and their viability, an aspect often over-
looked by novice problem solvers (Feltovich et
al., 1996).
However, one concern with regard to our
conclusions about the role of question prompts
is that the prompts may have provided a strong
advantage for students in the question-prompt
conditions to perform well on the assessment,
because the prompts focused on the same prob-
lem-solving processes as the rubrics. This limita-
tion of our study should be taken into
consideration when developing question
prompts and scoring rubrics for future work.
Yet, we believe that the alignment of the ques-
tion prompts and rubrics might serve as a guide
for how prompts should map assessment, if
prompting could work to help students achieve
expected learning outcomes.
The quantitative results of the students’ prob-
lem-solving performance conflict with previous
studies that showed positive effects of peer in-
teractions in developing learner cognition and
metacognition (Palincsar et al., 1987; Webb,
1982, 1989) and in improving performance and
achievement (Johnson et al., 1985, 1986, 1990).
However, the post hoc test indicating sig-
nificantly better performance of the peer condi-
tions over the individual conditions in problem
representation suggests that peer interactions
can be an effective scaffolding strategy under
certain conditions. There might be several ex-
planations for the results of peer interactions
found in this study, including time constraints
and short period of treatment. From field obser-
vations, we found that most of the groups spent
the first part of the lab time exploring the prob-
lem space and brainstorming solutions; how-
ever, their remaining time was spent
distributing specific tasks among individuals,
with little interaction and feedback. The more
frequent peer interactions observed at the begin-
ning of the lab session might explain why peer
interaction conditions performed significantly
better than individual conditions in problem
representation. Another explanation might be
that the group report measure was not sensitive
enough to pick up the individual learning effects
produced by the group processing. If that is the
case, other measures are needed to effectively
evaluate the achievement of individuals as a
result of the peer interaction process.
In addition, certain conditions might be re-
quired for the peer interaction strategy to work
fully to facilitate problem solving. Group mem-
bers might not know how to ask productive
questions or how to elaborate thoughts, espe-
cially when their domain knowledge is limited
(Land, 2000). As indicated by Webb and
Palincsar (1996), in order for students to benefit
from collaboration, they must request and pro-
vide explanations, compare ideas, and sys-
tematically use and evaluate evidence. Webb
(1989) found that the students who learned most
from peer interactions were those who asked for
and provided explanations to others. King
(1989) found that small groups that asked task-
related questions and elaborated solutions were
more successful at problem solving than groups
that did not exhibit these behaviors. It was ob-
served in our study that different group
dynamics and compositions had different effects
on the peer interaction process. For example, in
Case 8, we saw that one student dominated the
peer interactions by doing all the explanation
while the other two students were relatively
compliant. But, although collaboration was
limited, these two students reported that they
learned a lot from the dominant group member,
a finding which again suggests a need for
measuring individual achievement during the
peer interaction process in future studies.
Consistent with previous research, the
qualitative data showed that students could
benefit from peer interactions in several ways,
such as building on each other’s ideas, eliciting
responses or explanation (Webb, 1989), sharing
multiple perspectives (Lin et al., 1999), and
taking advantage of each other’s knowledge and
competence (Pea, 1993). Thus, a remaining em-
pirical question is how we can maximize the
positive effects of peer interactions. Previous re-
search highlights the prominent role of the in-
structor in modeling effective comprehension
monitoring (Greene & Land, 2000; Palincsar &
Brown, 1984; Palincsar et al., 1987) and group-
processing strategies (Johnson et al., 1985, 1986,
1990). With Johnson et al.’s (1990) group-
processing strategy, for example, teachers and
students review a group session, describe which
member actions were helpful and which were not,
and decide what actions to continue or change.
Our study indicated that
simply placing students into groups unguided
was not sufficient, and that providing students
with question prompts might also be insufficient
for effectively guiding interactions. As shown by
Case 5, even though the PQ condition was
provided with prompts, it still did not ensure
that every group member used them produc-
tively. Therefore, additional strategies, such as
instructor modeling and monitoring, might also
be needed to scaffold the processes of asking
questions, elaborating, explaining, constructing
arguments, providing constructive feedback,
and monitoring.
Despite the fact that the quantitative results
failed to support our hypothesis that the com-
bined use of question prompts and peer interac-
tions was most effective in facilitating
ill-structured problem-solving processes, the
qualitative data did point to some guiding and
modeling effects of the question prompts and
the potential benefits of peer interaction. The fact
that no interaction effect was found might be
due to the small sample size. As noted in the
results section, there was a trend that the PQ condi-
tion had higher means in problem representation
and generating solutions than the other condi-
tions. Increasing the sample size might increase
the statistical power that would result in a sig-
nificant interaction. Another speculation is that
question prompts might have effects in guiding
individuals within a group through the prob-
lem-solving process; however, it is unclear
whether the question prompts helped the stu-
dents to challenge each other with their own
questions, elicit more explanations, and enhance
argumentation and feedback. The question
prompts might be useful to guide both in-
dividuals and peers to focus on the questions
provided, but they might offer limited support
for helping students generate questions of their own,
or elaborate and clarify each other’s under-
standing. This suggests that additional
strategies, such as King’s (1991, 1992, 1994)
question-generation and elaboration prompts,
and Palincsar and Brown’s (1984) modeling of
clarifying, predicting, and monitoring, are
needed for peers to use question prompts effec-
tively to maximize interactions through inter-
pretation, elaboration, explanation, negotiation,
and argumentation.
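
As for the small-sample explanation raised above, a prospective power analysis could make the needed sample size concrete before any replication is run. The following is a minimal sketch in Python, assuming the statsmodels library and a conventional medium effect size (Cohen's f = 0.25); these inputs are illustrative assumptions rather than values reported in this study, and a factorial design would warrant a more careful analysis of the interaction contrast itself.

```python
# Minimal prospective power analysis for a four-condition design
# (question prompts x peer interaction), sketched with statsmodels.
# effect_size and alpha below are illustrative assumptions.
from statsmodels.stats.power import FTestAnovaPower

analysis = FTestAnovaPower()

# Solve for the total N needed to detect a medium omnibus effect
# (Cohen's f = 0.25) across k_groups = 4 conditions at 80% power.
n_total = analysis.solve_power(effect_size=0.25, alpha=0.05,
                               power=0.80, k_groups=4)
print(f"Approximate total N for 80% power: {n_total:.0f}")
```

Under these assumptions the estimate comes to roughly 45 participants per condition, which illustrates why a study of this scale could plausibly miss a true interaction effect.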
A modified replication of the study is necessary,
with an increased number of treatment sessions,
a larger sample, random assignment, and the use
of pretest and posttest measures to ensure the
equivalence of conditions. Individual
achievement should be measured in addition to
group performance. Future studies should also
investigate individual accountability in addition
to group goals in the peer interaction process
(see Slavin, 1989). In this study, we focused more
on group goals than on individual account-
ability, where the contribution of each member
is identifiable. Other research efforts could involve
examining the transfer effect of question prompts
on student-generated questioning and its effect
on ill-structured problem solving. Such a design
would help overcome the drawback that providing
question prompts may itself have advantaged the
question-prompt treatment conditions.
Xun Ge [xge@ou.edu] is Assistant Professor with the
Instructional Psychology and Technology Program at
The University of Oklahoma.
Susan M. Land [sland@psu.edu] is Assistant
Professor with the Instructional Systems Program at
The Pennsylvania State University.
REFERENCES
Bransford, J.D., Brown, A.L., & Cocking, R.R. (Eds.).
(2000). How people learn: Brain, mind, experience, and
school. Washington, DC: National Academy Press.
Bransford, J.D., & Stein, B.S. (1993). The IDEAL problem
solver: A guide for improving thinking, learning, and
creativity (2nd ed.). New York: W.H. Freeman and
Company.
Brown, A.L. (1987). Metacognition, executive control,
self-regulation, and other more mysterious
mechanisms. In F.E. Weinert & R.H. Kluwe (Eds.),
Metacognition, motivation, and understanding (pp. 65–
116). Hillsdale, NJ: Lawrence Erlbaum Associates.
Brown, A.L., & Palincsar, A.S. (1989). Guided,
cooperative learning and individual knowledge ac-
quisition. In L.B. Resnick (Ed.), Knowing, learning and
instruction: Essays in honor of Robert Glaser (pp. 393–
451). Hillsdale, NJ: Lawrence Erlbaum Associates.
Chi, M., Bassok, M., Lewis, M., Reimann, P., & Glaser,
R. (1989). Self-explanations: How students study
and use examples in learning to solve problems.
Cognitive Science, 13, 145–182.
Chi, M.T.H., Feltovich, P., & Glaser, R. (1981).
Categorization and representation of physics
problems by experts and novices. Cognitive Science,
5, 121–152.
Chi, M.T.H., & Glaser, R. (1985). Problem solving
ability. In R.J. Sternberg (Ed.), Human abilities: An in-
formation processing approach (pp. 227–250). New
York: W.H. Freeman and Company.
Davis, E.A., & Linn, M. (2000). Scaffolding students’
knowledge integration: Prompts for reflection in
KIE. International Journal of Science Education, 22(8),
819–837.
Ericsson, K.A., & Simon, H.A. (1996). Protocol analysis:
Verbal reports as data (Rev. ed.). Cambridge, MA:
MIT Press.
Feltovich, P.J., Spiro, R.J., Coulson, R.L., & Feltovich, J.
(1996). Collaboration within and among minds:
Mastering complexity, individually and in groups.
In T. Koschmann (Ed.), CSCL: Theory and practice of
an emerging paradigm (pp. 25–44). Mahwah, NJ:
Lawrence Erlbaum Associates.
Gick, M.L. (1986). Problem solving strategies. Educa-
tional Psychologist, 21(1&2), 99–120.
Gick, M.L., & Holyoak, K.J. (1980). Analogical problem
solving. Cognitive Psychology, 12, 306–355.
Greene, B.A., & Land, S.M. (2000). A qualitative
analysis of scaffolding use in a resource-based learn-
ing environment involving the World Wide
Web. Journal of Educational Computing Research, 23(2),
151–180.
Greene, J.C., Caracelli, V.J., & Graham, W.F. (1989).
Toward a conceptual framework for mixed-method
evaluation designs. Educational Evaluation and Policy
Analysis, 11, 255–274.
Hannafin, M., Land, S., & Oliver, K. (1999). Open
learning environments: Foundations, methods, and
models. In C.M. Reigeluth (Ed.), Instructional-design
theories and models: Vol. 2. A new paradigm of instruc-
tional theory (pp. 115–140). Mahwah, NJ: Lawrence
Erlbaum Associates.
Johnson, R.T., Johnson, D.W., & Stanne, M.B. (1985).
Effects of cooperative, competitive, and in-
dividualistic goal structures on computer-assisted
instruction. Journal of Educational Psychology, 77(6),
668–677.
Johnson, R.T., Johnson, D.W., & Stanne, M.B. (1986).
Comparison of computer-assisted cooperative, com-
petitive, and individualistic learning. American
Educational Research Journal, 23(3), 382–392.
Johnson, D.W., Johnson, R.T., Stanne, M.B., & Garibal-
di, A. (1990). Impact of group processing on achieve-
ment in cooperative groups. The Journal of Social
Psychology, 130(4), 507–516.
Jonassen, D.H. (1997). Instructional design models for
well-structured and ill-structured problem-solving
learning outcomes. Educational Technology Research
and Development, 45(1), 65–94.
Jonassen, D.H., Beissner, K., & Yacci, M. (1993). Struc-
tural knowledge. Hillsdale, NJ: Lawrence Erlbaum
Associates.
King, A. (1989). Verbal interaction and problem solv-
ing within computer-assisted cooperative learning
groups. Journal of Educational Computing Research,
5(1), 1–15.
King, A. (1991). Effects of training in strategic ques-
tioning on children’s problem-solving performance.
Journal of Educational Psychology, 83(3), 307–317.
King, A. (1992). Facilitating elaborative learning
through guided student-generated questioning.
Educational Psychologist, 27(1), 111–126.
King, A. (1994). Guiding knowledge construction in
the classroom: Effects of teaching children how to
question and how to explain. American Educational
Research Journal, 31(2), 338–368.
King, A., & Rosenshine, B. (1993). Effect of guided
cooperative questioning on children’s knowledge
construction. Journal of Experimental Education, 61(2),
127–148.
Kitchner, K.S. (1983). Cognition, metacognition, and
epistemic cognition: A three-level model of cogni-
tive processing. Human Development, 26, 222–232.
Kitchner, K.S., & King, P.M. (1981). Reflective judg-
ment: Concepts of justification and their relationship
to age and education. Journal of Applied Developmen-
tal Psychology, 2, 89–116.
Land, S.M. (2000). Cognitive requirements for learning
with open-ended learning environments. Education-
al Technology Research and Development, 48(3), 61–78.
Lin, X., Hmelo, C., Kinzer, C.K., & Secules, T.J. (1999).
Designing technology to support reflection. Educa-
tional Technology Research and Development, 47(3), 43–
62.
Lin, X., & Lehman, J.D. (1999). Supporting learning of
variable control in a computer-based biology en-
vironment: Effects of prompting college students to
reflect on their own thinking. Journal of Research in
Science Teaching, 36(7), 837–858.
Lou, Y., Abrami, P.C., & d’Apollonia, S. (2001). Small
group and individual learning with technology: A
meta-analysis. Review of Educational Research, 71(3),
449–521.
Miles, M.B., & Huberman, A.M. (1994). Qualitative
data analysis: An expanded sourcebook (2nd ed.).
Thousand Oaks, CA: Sage Publications.
Osman, M.E., & Hannafin, M.J. (1994). Effects of ad-
vance questioning and prior knowledge on science
learning. Journal of Educational Research, 88(1), 5–13.
Palincsar, A.S., & Brown, A.L. (1984). Reciprocal teach-
ing of comprehension-fostering and comprehen-
sion-monitoring activities. Cognition and Instruction,
2, 117–175.
Palincsar, A.S., Brown, A.L., & Martin, S.M. (1987).
Peer interaction in reading comprehension instruc-
tion. Educational Psychologist, 22(3–4), 231–253.
Pea, R. (1993). Practices of distributed intelligence and
designs for education. In G. Salomon (Ed.), Dis-
tributed cognitions: Psychological and educational con-
siderations (pp. 47–87). Cambridge, UK: Cambridge
University Press.
Perkins, D.N. (1993). Persons-plus: A distributed view
of thinking and learning. In G. Salomon (Ed.), Dis-
tributed cognitions: Psychological and educational con-
siderations (pp. 88–110). Cambridge, UK: Cambridge
University Press.
Pressley, M., & McCormick, C.B. (1987). Advanced
educational psychology for educators, researchers, and
policy makers. New York: HarperCollins.
Roschelle, J. (1992). Learning by collaborating: Conver-
gent conceptual change. The Journal of the Learning Sciences,
2, 235–276.
Rosenshine, B., Meister, C., & Chapman, S. (1996).
Teaching students to generate questions: A review
of the intervention studies. Review of Educational Re-
search, 66(2), 181–221.
Salomon, G. (1993). No distribution without
individuals’ cognition: A dynamic interactional
view. In G. Salomon (Ed.), Distributed cognitions:
Psychological and educational considerations (pp. 111–
138). Cambridge, UK: Cambridge University Press.
Scardamalia, M., Bereiter, C., McLean, R.S., Swallow,
J., & Woodruff, E. (1989). Computer-supported in-
tentional learning environments. Journal of Educa-
tional Computing Research, 5, 51–68.
Scardamalia, M., Bereiter, C., & Steinbach, R. (1984).
Teachability of reflective processes in written com-
position. Cognitive Science, 8, 173–190.
Schoenfeld, A.H. (1985). Mathematical problem-solving.
San Diego, CA: Academic Press.
Sinnott, J.D. (1989). A model for solution of ill-struc-
tured problems: Implications for everyday and
abstract problem solving. In J.D. Sinnott (Ed.),
Everyday problem solving: Theory and application (pp.
72–99). New York: Praeger.
Slavin, R.E. (1989). Cooperative learning and student
achievement. In R.E. Slavin (Ed.), School and class-
room organization (pp. 129–156). Hillsdale, NJ:
Lawrence Erlbaum Associates.
Stake, R.E. (2000). Case studies. In N.K. Denzin & Y.S.
Lincoln (Eds.), Handbook of qualitative research (2nd
ed., pp. 435–454). Thousand Oaks, CA: Sage Publica-
tions.
Strauss, A., & Corbin, J. (1998). Basics of qualita-
tive research: Techniques and procedures for developing
grounded theory. Thousand Oaks, CA: Sage Publica-
tions.
van Zee, E., & Minstrell, J. (1997). Using questioning to
guide student thinking. The Journal of the Learning
Sciences, 6(2), 227–269.
Voss, J.F. (1988). Problem solving and reasoning in ill-
structured domains. In C. Antaki (Ed.), Analyzing
everyday explanation: A casebook of methods (pp. 74–
93). London: Sage Publications.
Voss, J.F., & Post, T.A. (1988). On the solving of ill-
structured problems. In M.H. Chi, R. Glaser, & M.J.
Farr (Eds.), The nature of expertise (pp. 261–285).
Hillsdale, NJ: Lawrence Erlbaum Associates.
Voss, J.F., Wolfe, C.R., Lawrence, J.A., & Engle, R.A.
(1991). From representation to decision: An analysis
of problem solving in international relations. In R.J.
Sternberg & P.A. Frensch (Eds.), Complex problem
solving: Principles and mechanisms (pp. 119–158).
Hillsdale, NJ: Lawrence Erlbaum Associates.
Vygotsky, L.S. (1978). Mind in society. Cambridge, MA:
Harvard University Press.
Webb, N.M. (1982). Group composition, group interac-
tion and achievement in cooperative small groups.
Journal of Educational Psychology, 74, 475–484.
Webb, N.M. (1989). Peer interaction and learning in
small groups. International Journal of Educational Re-
search, 13, 21–39.
Webb, N.M., & Palincsar, A.S. (1996). Group processes
in the classroom. In D.C. Berliner & R.C. Calfee
(Eds.), Handbook of educational psychology (pp. 841–
873). New York: Simon & Schuster Macmillan.
Wineburg, S.S. (1998). Reading Abraham Lincoln: An
expert-expert study in the interpretation of historical
texts. Cognitive Science, 22, 319–346.
Wong, B.Y.L. (1985). Self-questioning instructional re-
search: A review. Review of Educational Research, 55,
227–268.
Yin, R.K. (1989). Case study research: Design and methods
(2nd ed.). Thousand Oaks, CA: Sage Publications.
Appendix A The Question Prompt Treatment Material
Something to Think About . . .
As you work through the problem, please read and think about the following questions.
How do I define the problem?
1. What are the parts of the problem?
2. What are the technical components?
3. What information do you need for this system? How will the system be used, by whom, and for what?
• Who would be the users?
• What information do you expect to be needed by the users?
• What level of prior knowledge do you expect the users to have?
• How would a user ideally interact with the proposed system?
What solutions do I need to generate?
4. What should the system do?
5. How should the different technical components of the proposed system interrelate?
6. What are the risks?
What are my reasons or what is my argument for my proposed solution?
7. How would I justify this specific system design? For example, if I develop a web-based solution, can
I explain why I took that approach?
8. Do I have evidence to support my solution (that is, the specific IT system I have proposed)? What is
my chain of reasoning to support my solution?
Am I on the right track?
9. Have I discussed both the technical components and the issues with use, for example, usability and
effectiveness?
10. Are there alternative solutions?
• What are they?
• How are they compared with my proposed system?
• What argument can I make or what evidence do I have to convince the manager that my solution
is the most viable?
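
Because the Appendix A prompts are organized by problem-solving phase, the instrument lends itself to a simple phase-keyed data structure if the prompts were delivered through a web page rather than on paper. The sketch below is hypothetical; the dictionary, the phase labels, and the prompts_for helper are illustrative, not the study's actual delivery mechanism, and only a subset of the prompts is shown.

```python
# A hypothetical representation of the Appendix A instrument as a
# phase-keyed mapping, e.g., for embedding the prompts in a web page.
PROMPTS = {
    "problem representation": [
        "What are the parts of the problem?",
        "What are the technical components?",
        "What information do you need for this system?",
    ],
    "developing solutions": [
        "What should the system do?",
        "How should the different technical components interrelate?",
        "What are the risks?",
    ],
    "making justifications": [
        "How would I justify this specific system design?",
        "Do I have evidence to support my solution?",
    ],
    "monitoring and evaluating": [
        "Have I discussed both the technical components and the issues with use?",
        "Are there alternative solutions?",
    ],
}

def prompts_for(phase: str) -> list[str]:
    """Return the question prompts to display for a given phase."""
    return PROMPTS.get(phase.lower(), [])

if __name__ == "__main__":
    for question in prompts_for("Making justifications"):
        print("-", question)
```

Keeping the prompts in one structure like this would also make it straightforward to log which prompts a student viewed at each phase, one of the process measures a replication might want.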
SCAFFOLDING ILL-STRUCTURED PROBLEM SOLVING 37
Appendix B Data display example: The effects of peer interactions on problem-solving
processes.
Peer Interactions: Brainstorm
  Reactions or Consequences: Come up with different ideas to solve the problem (e.g., Cases 5, 6, 7)
  Data Sources: Interviews; observation
  Processes or Thinking Influenced: Problem representation (cognitive thinking)

Peer Interactions: Ask questions
  Reactions or Consequences: Explain, e.g., Case 8: explain what a PDA is (observation); Case 5: what a server is (interview). Examine thinking process, solutions, etc., e.g., Case 7: cause one to examine the feasibility of a solution (interview); Case 8: evaluate the solutions and modify them accordingly (observation)
  Data Sources: Observation; interview
  Processes or Thinking Influenced: Developing solutions (cognitive thinking); making justifications (metacognitive skills); monitoring and evaluating solution process (metacognitive skills)

Peer Interactions: Provide feedback
  Reactions or Consequences: Monitor thinking process, e.g., Case 6: examine pros and cons, decide what systems to use (e.g., external vs. internal); Case 5: test the system. Reflect on one's thinking, e.g., Case 5: see things that one could not have thought about
  Data Sources: Interviews
  Processes or Thinking Influenced: Monitoring and evaluating solution process (metacognitive skills)

Peer Interactions: Elaborate ideas
  Reactions or Consequences: Build on each other's ideas (e.g., Cases 6, 7, 8)
  Data Sources: Observations; interviews
  Processes or Thinking Influenced: Developing solutions (cognitive thinking); problem representation (cognitive thinking)

Peer Interactions: Make suggestions
  Reactions or Consequences: Build on each other's ideas for developing solutions (Cases 6, 7, 8); see things from other perspectives; think about things that could not have been thought about (Case 5)
  Data Sources: Observations; interviews
  Processes or Thinking Influenced: Developing solutions (cognitive thinking); monitoring and evaluating solution process (metacognitive skills)

Peer Interactions: Share ideas
  Reactions or Consequences: Get multiple perspectives (Cases 5, 6, 8); share expertise, e.g., Case 6: my ideas combined with other people's ideas; Cases 7, 8: take expertise from each other
  Data Sources: Interviews
  Processes or Thinking Influenced: Monitoring and evaluating solution process (metacognitive skills); developing solutions (cognitive thinking)
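
The data display above is, in effect, a qualitative coding scheme mapping each type of peer interaction to the problem-solving processes it was observed to influence. A minimal sketch of how such a scheme might be tallied over coded transcript events follows; the event list and category strings are hypothetical illustrations, not the study's actual coding procedure.

```python
# Tally coded peer-interaction events against the Appendix B scheme.
# The events list below is hypothetical; category names follow the
# data display above.
from collections import Counter

# Map each interaction type to the processes it was observed to influence.
SCHEME = {
    "brainstorm": ["problem representation"],
    "ask questions": ["developing solutions", "making justifications",
                      "monitoring and evaluating"],
    "provide feedback": ["monitoring and evaluating"],
    "elaborate ideas": ["developing solutions", "problem representation"],
    "make suggestions": ["developing solutions", "monitoring and evaluating"],
    "share ideas": ["monitoring and evaluating", "developing solutions"],
}

# Hypothetical coded events from one lab-session transcript.
events = ["brainstorm", "ask questions", "share ideas", "ask questions"]

tally = Counter(process for event in events for process in SCHEME[event])
print(tally.most_common())
```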
38 ETR&D, Vol. 51, No. 1

More Related Content

Similar to A Conceptual Framework For Scaffolding III-Structured Problem-Solving Processes Using Question Prompts And Peer Interactions

Critical Reflection And The Reflective Practitioner
Critical Reflection And The Reflective PractitionerCritical Reflection And The Reflective Practitioner
Critical Reflection And The Reflective PractitionerYan Karaliotas
 
An Exploratory Study Of A Story Problem Assessment Understanding Children S ...
An Exploratory Study Of A Story Problem Assessment  Understanding Children S ...An Exploratory Study Of A Story Problem Assessment  Understanding Children S ...
An Exploratory Study Of A Story Problem Assessment Understanding Children S ...Angie Miller
 
Algebraic Thinking A Problem Solving Approach
Algebraic Thinking  A Problem Solving ApproachAlgebraic Thinking  A Problem Solving Approach
Algebraic Thinking A Problem Solving ApproachCheryl Brown
 
2docs.lib.purdue.edujps 2015 Volume 8Journal of Problem.docx
2docs.lib.purdue.edujps 2015  Volume 8Journal of Problem.docx2docs.lib.purdue.edujps 2015  Volume 8Journal of Problem.docx
2docs.lib.purdue.edujps 2015 Volume 8Journal of Problem.docxgilbertkpeters11344
 
An overview of the field of education
An overview of the field of educationAn overview of the field of education
An overview of the field of educationrasyid Ridha
 
An overview of the field of education
An overview of the field of educationAn overview of the field of education
An overview of the field of educationrasyid Ridha
 
A Process For Solving Ill-Structured Problem Supported By Ontology And Softwa...
A Process For Solving Ill-Structured Problem Supported By Ontology And Softwa...A Process For Solving Ill-Structured Problem Supported By Ontology And Softwa...
A Process For Solving Ill-Structured Problem Supported By Ontology And Softwa...Joshua Gorinson
 
Analysis Of Precurrent Skills In Solving Mathematics Story Problems
Analysis Of Precurrent Skills In Solving Mathematics Story ProblemsAnalysis Of Precurrent Skills In Solving Mathematics Story Problems
Analysis Of Precurrent Skills In Solving Mathematics Story ProblemsSara Alvarez
 
Beyond Show And Tell to Problem Solving: Exploring the Discrepancies between ...
Beyond Show And Tell to Problem Solving: Exploring the Discrepancies between ...Beyond Show And Tell to Problem Solving: Exploring the Discrepancies between ...
Beyond Show And Tell to Problem Solving: Exploring the Discrepancies between ...Prince Armah, PhD
 
Action Research In Second Language Teacher Education
Action Research In Second Language Teacher EducationAction Research In Second Language Teacher Education
Action Research In Second Language Teacher EducationCynthia King
 
An Interactive Online Course A Collaborative Design Model
An Interactive Online Course  A Collaborative Design ModelAn Interactive Online Course  A Collaborative Design Model
An Interactive Online Course A Collaborative Design ModelRichard Hogue
 
Research on Graphic Organizers
Research on Graphic OrganizersResearch on Graphic Organizers
Research on Graphic Organizersjwalts
 
Transformations of scaffolding concept in socio-technical systems.pdf
Transformations of scaffolding concept in socio-technical systems.pdfTransformations of scaffolding concept in socio-technical systems.pdf
Transformations of scaffolding concept in socio-technical systems.pdfKai Pata
 
An Instrument To Support Thinking Critically About Critical Thinking In Onlin...
An Instrument To Support Thinking Critically About Critical Thinking In Onlin...An Instrument To Support Thinking Critically About Critical Thinking In Onlin...
An Instrument To Support Thinking Critically About Critical Thinking In Onlin...Daniel Wachtel
 
An ICT Environment To Assess And Support Students Mathematical Problem-Solvi...
An ICT Environment To Assess And Support Students  Mathematical Problem-Solvi...An ICT Environment To Assess And Support Students  Mathematical Problem-Solvi...
An ICT Environment To Assess And Support Students Mathematical Problem-Solvi...Vicki Cristol
 
Assessing and promoting computer-supported collaborative learning
Assessing and promoting computer-supported collaborative learningAssessing and promoting computer-supported collaborative learning
Assessing and promoting computer-supported collaborative learningtelss09
 
Is it True? Always? Supporting Reasoning and Proof Focused Collaboration amon...
Is it True? Always? Supporting Reasoning and Proof Focused Collaboration amon...Is it True? Always? Supporting Reasoning and Proof Focused Collaboration amon...
Is it True? Always? Supporting Reasoning and Proof Focused Collaboration amon...Nicole Rigelman
 
Approaches of Student centred Startegies .pdf
Approaches of Student centred Startegies .pdfApproaches of Student centred Startegies .pdf
Approaches of Student centred Startegies .pdfAbdelmoneim Abusin
 
The relationship between reflective thinking and learning styles among sample...
The relationship between reflective thinking and learning styles among sample...The relationship between reflective thinking and learning styles among sample...
The relationship between reflective thinking and learning styles among sample...Alexander Decker
 
Toolbox for better teaching
Toolbox for better teachingToolbox for better teaching
Toolbox for better teachingSjoerd Heeringa
 

Similar to A Conceptual Framework For Scaffolding III-Structured Problem-Solving Processes Using Question Prompts And Peer Interactions (20)

Critical Reflection And The Reflective Practitioner
Critical Reflection And The Reflective PractitionerCritical Reflection And The Reflective Practitioner
Critical Reflection And The Reflective Practitioner
 
An Exploratory Study Of A Story Problem Assessment Understanding Children S ...
An Exploratory Study Of A Story Problem Assessment  Understanding Children S ...An Exploratory Study Of A Story Problem Assessment  Understanding Children S ...
An Exploratory Study Of A Story Problem Assessment Understanding Children S ...
 
Algebraic Thinking A Problem Solving Approach
Algebraic Thinking  A Problem Solving ApproachAlgebraic Thinking  A Problem Solving Approach
Algebraic Thinking A Problem Solving Approach
 
2docs.lib.purdue.edujps 2015 Volume 8Journal of Problem.docx
2docs.lib.purdue.edujps 2015  Volume 8Journal of Problem.docx2docs.lib.purdue.edujps 2015  Volume 8Journal of Problem.docx
2docs.lib.purdue.edujps 2015 Volume 8Journal of Problem.docx
 
An overview of the field of education
An overview of the field of educationAn overview of the field of education
An overview of the field of education
 
An overview of the field of education
An overview of the field of educationAn overview of the field of education
An overview of the field of education
 
A Process For Solving Ill-Structured Problem Supported By Ontology And Softwa...
A Process For Solving Ill-Structured Problem Supported By Ontology And Softwa...A Process For Solving Ill-Structured Problem Supported By Ontology And Softwa...
A Process For Solving Ill-Structured Problem Supported By Ontology And Softwa...
 
Analysis Of Precurrent Skills In Solving Mathematics Story Problems
Analysis Of Precurrent Skills In Solving Mathematics Story ProblemsAnalysis Of Precurrent Skills In Solving Mathematics Story Problems
Analysis Of Precurrent Skills In Solving Mathematics Story Problems
 
Beyond Show And Tell to Problem Solving: Exploring the Discrepancies between ...
Beyond Show And Tell to Problem Solving: Exploring the Discrepancies between ...Beyond Show And Tell to Problem Solving: Exploring the Discrepancies between ...
Beyond Show And Tell to Problem Solving: Exploring the Discrepancies between ...
 
Action Research In Second Language Teacher Education
Action Research In Second Language Teacher EducationAction Research In Second Language Teacher Education
Action Research In Second Language Teacher Education
 
An Interactive Online Course A Collaborative Design Model
An Interactive Online Course  A Collaborative Design ModelAn Interactive Online Course  A Collaborative Design Model
An Interactive Online Course A Collaborative Design Model
 
Research on Graphic Organizers
Research on Graphic OrganizersResearch on Graphic Organizers
Research on Graphic Organizers
 
Transformations of scaffolding concept in socio-technical systems.pdf
Transformations of scaffolding concept in socio-technical systems.pdfTransformations of scaffolding concept in socio-technical systems.pdf
Transformations of scaffolding concept in socio-technical systems.pdf
 
An Instrument To Support Thinking Critically About Critical Thinking In Onlin...
An Instrument To Support Thinking Critically About Critical Thinking In Onlin...An Instrument To Support Thinking Critically About Critical Thinking In Onlin...
An Instrument To Support Thinking Critically About Critical Thinking In Onlin...
 
An ICT Environment To Assess And Support Students Mathematical Problem-Solvi...
An ICT Environment To Assess And Support Students  Mathematical Problem-Solvi...An ICT Environment To Assess And Support Students  Mathematical Problem-Solvi...
An ICT Environment To Assess And Support Students Mathematical Problem-Solvi...
 
Assessing and promoting computer-supported collaborative learning
Assessing and promoting computer-supported collaborative learningAssessing and promoting computer-supported collaborative learning
Assessing and promoting computer-supported collaborative learning
 
Is it True? Always? Supporting Reasoning and Proof Focused Collaboration amon...
Is it True? Always? Supporting Reasoning and Proof Focused Collaboration amon...Is it True? Always? Supporting Reasoning and Proof Focused Collaboration amon...
Is it True? Always? Supporting Reasoning and Proof Focused Collaboration amon...
 
Approaches of Student centred Startegies .pdf
Approaches of Student centred Startegies .pdfApproaches of Student centred Startegies .pdf
Approaches of Student centred Startegies .pdf
 
The relationship between reflective thinking and learning styles among sample...
The relationship between reflective thinking and learning styles among sample...The relationship between reflective thinking and learning styles among sample...
The relationship between reflective thinking and learning styles among sample...
 
Toolbox for better teaching
Toolbox for better teachingToolbox for better teaching
Toolbox for better teaching
 

More from Sandra Long

Essay On Teachers Day (2023) In English Short, Simple Best
Essay On Teachers Day (2023) In English Short, Simple BestEssay On Teachers Day (2023) In English Short, Simple Best
Essay On Teachers Day (2023) In English Short, Simple BestSandra Long
 
10 Best Printable Handwriting Paper Template PDF For Free At Printablee
10 Best Printable Handwriting Paper Template PDF For Free At Printablee10 Best Printable Handwriting Paper Template PDF For Free At Printablee
10 Best Printable Handwriting Paper Template PDF For Free At PrintableeSandra Long
 
Buy College Application Essay. Online assignment writing service.
Buy College Application Essay. Online assignment writing service.Buy College Application Essay. Online assignment writing service.
Buy College Application Essay. Online assignment writing service.Sandra Long
 
FREE 6 Sample Informative Essay Templates In MS Word
FREE 6 Sample Informative Essay Templates In MS WordFREE 6 Sample Informative Essay Templates In MS Word
FREE 6 Sample Informative Essay Templates In MS WordSandra Long
 
Small Essay On Education. Small Essay On The Educ
Small Essay On Education. Small Essay On The EducSmall Essay On Education. Small Essay On The Educ
Small Essay On Education. Small Essay On The EducSandra Long
 
Where Can I Buy A Persuasive Essay, Buy Per
Where Can I Buy A Persuasive Essay, Buy PerWhere Can I Buy A Persuasive Essay, Buy Per
Where Can I Buy A Persuasive Essay, Buy PerSandra Long
 
Chinese Writing Practice Paper With Pinyin Goodnot
Chinese Writing Practice Paper With Pinyin GoodnotChinese Writing Practice Paper With Pinyin Goodnot
Chinese Writing Practice Paper With Pinyin GoodnotSandra Long
 
Elephant Story Writing Sample - Aus - Elephant W
Elephant Story Writing Sample - Aus - Elephant WElephant Story Writing Sample - Aus - Elephant W
Elephant Story Writing Sample - Aus - Elephant WSandra Long
 
391505 Paragraph-Writ. Online assignment writing service.
391505 Paragraph-Writ. Online assignment writing service.391505 Paragraph-Writ. Online assignment writing service.
391505 Paragraph-Writ. Online assignment writing service.Sandra Long
 
Get Essay Writing Assignment Help Writing Assignments, Essay Writing
Get Essay Writing Assignment Help Writing Assignments, Essay WritingGet Essay Writing Assignment Help Writing Assignments, Essay Writing
Get Essay Writing Assignment Help Writing Assignments, Essay WritingSandra Long
 
Ampad EZ Flag Writing Pad, LegalWide, 8 12 X 11, Whi
Ampad EZ Flag Writing Pad, LegalWide, 8 12 X 11, WhiAmpad EZ Flag Writing Pad, LegalWide, 8 12 X 11, Whi
Ampad EZ Flag Writing Pad, LegalWide, 8 12 X 11, WhiSandra Long
 
The Federalist Papers Writers Nozna.Net. Online assignment writing service.
The Federalist Papers Writers Nozna.Net. Online assignment writing service.The Federalist Papers Writers Nozna.Net. Online assignment writing service.
The Federalist Papers Writers Nozna.Net. Online assignment writing service.Sandra Long
 
Whoever Said That Money CanT Buy Happiness, Simply DidnT
Whoever Said That Money CanT Buy Happiness, Simply DidnTWhoever Said That Money CanT Buy Happiness, Simply DidnT
Whoever Said That Money CanT Buy Happiness, Simply DidnTSandra Long
 
How To Write An Essay In College Odessa Howtowrit
How To Write An Essay In College Odessa HowtowritHow To Write An Essay In College Odessa Howtowrit
How To Write An Essay In College Odessa HowtowritSandra Long
 
How To Write A Career Research Paper. Online assignment writing service.
How To Write A Career Research Paper. Online assignment writing service.How To Write A Career Research Paper. Online assignment writing service.
How To Write A Career Research Paper. Online assignment writing service.Sandra Long
 
Columbia College Chicago Notable Alumni - INFOLEARNERS
Columbia College Chicago Notable Alumni - INFOLEARNERSColumbia College Chicago Notable Alumni - INFOLEARNERS
Columbia College Chicago Notable Alumni - INFOLEARNERSSandra Long
 
001 P1 Accounting Essay Thatsnotus. Online assignment writing service.
001 P1 Accounting Essay Thatsnotus. Online assignment writing service.001 P1 Accounting Essay Thatsnotus. Online assignment writing service.
001 P1 Accounting Essay Thatsnotus. Online assignment writing service.Sandra Long
 
Essay Writing Tips That Will Make Col. Online assignment writing service.
Essay Writing Tips That Will Make Col. Online assignment writing service.Essay Writing Tips That Will Make Col. Online assignment writing service.
Essay Writing Tips That Will Make Col. Online assignment writing service.Sandra Long
 
Pin On Essay Writer Box. Online assignment writing service.
Pin On Essay Writer Box. Online assignment writing service.Pin On Essay Writer Box. Online assignment writing service.
Pin On Essay Writer Box. Online assignment writing service.Sandra Long
 
How To Write A Funny Essay For College - Ai
How To Write A Funny Essay For College - AiHow To Write A Funny Essay For College - Ai
How To Write A Funny Essay For College - AiSandra Long
 

More from Sandra Long (20)

Essay On Teachers Day (2023) In English Short, Simple Best
Essay On Teachers Day (2023) In English Short, Simple BestEssay On Teachers Day (2023) In English Short, Simple Best
Essay On Teachers Day (2023) In English Short, Simple Best
 
10 Best Printable Handwriting Paper Template PDF For Free At Printablee
10 Best Printable Handwriting Paper Template PDF For Free At Printablee10 Best Printable Handwriting Paper Template PDF For Free At Printablee
10 Best Printable Handwriting Paper Template PDF For Free At Printablee
 
Buy College Application Essay. Online assignment writing service.
Buy College Application Essay. Online assignment writing service.Buy College Application Essay. Online assignment writing service.
Buy College Application Essay. Online assignment writing service.
 
FREE 6 Sample Informative Essay Templates In MS Word
FREE 6 Sample Informative Essay Templates In MS WordFREE 6 Sample Informative Essay Templates In MS Word
FREE 6 Sample Informative Essay Templates In MS Word
 
Small Essay On Education. Small Essay On The Educ
Small Essay On Education. Small Essay On The EducSmall Essay On Education. Small Essay On The Educ
Small Essay On Education. Small Essay On The Educ
 
Where Can I Buy A Persuasive Essay, Buy Per
Where Can I Buy A Persuasive Essay, Buy PerWhere Can I Buy A Persuasive Essay, Buy Per
Where Can I Buy A Persuasive Essay, Buy Per
 
Chinese Writing Practice Paper With Pinyin Goodnot
Chinese Writing Practice Paper With Pinyin GoodnotChinese Writing Practice Paper With Pinyin Goodnot
Chinese Writing Practice Paper With Pinyin Goodnot
 
Elephant Story Writing Sample - Aus - Elephant W
Elephant Story Writing Sample - Aus - Elephant WElephant Story Writing Sample - Aus - Elephant W
Elephant Story Writing Sample - Aus - Elephant W
 
391505 Paragraph-Writ. Online assignment writing service.
391505 Paragraph-Writ. Online assignment writing service.391505 Paragraph-Writ. Online assignment writing service.
391505 Paragraph-Writ. Online assignment writing service.
 
Get Essay Writing Assignment Help Writing Assignments, Essay Writing
Get Essay Writing Assignment Help Writing Assignments, Essay WritingGet Essay Writing Assignment Help Writing Assignments, Essay Writing
Get Essay Writing Assignment Help Writing Assignments, Essay Writing
 
Ampad EZ Flag Writing Pad, LegalWide, 8 12 X 11, Whi
Ampad EZ Flag Writing Pad, LegalWide, 8 12 X 11, WhiAmpad EZ Flag Writing Pad, LegalWide, 8 12 X 11, Whi
Ampad EZ Flag Writing Pad, LegalWide, 8 12 X 11, Whi
 
The Federalist Papers Writers Nozna.Net. Online assignment writing service.
The Federalist Papers Writers Nozna.Net. Online assignment writing service.The Federalist Papers Writers Nozna.Net. Online assignment writing service.
The Federalist Papers Writers Nozna.Net. Online assignment writing service.
 
Whoever Said That Money CanT Buy Happiness, Simply DidnT
Whoever Said That Money CanT Buy Happiness, Simply DidnTWhoever Said That Money CanT Buy Happiness, Simply DidnT
Whoever Said That Money CanT Buy Happiness, Simply DidnT
 
How To Write An Essay In College Odessa Howtowrit
How To Write An Essay In College Odessa HowtowritHow To Write An Essay In College Odessa Howtowrit
How To Write An Essay In College Odessa Howtowrit
 
How To Write A Career Research Paper. Online assignment writing service.
How To Write A Career Research Paper. Online assignment writing service.How To Write A Career Research Paper. Online assignment writing service.
How To Write A Career Research Paper. Online assignment writing service.
 
Columbia College Chicago Notable Alumni - INFOLEARNERS
Columbia College Chicago Notable Alumni - INFOLEARNERSColumbia College Chicago Notable Alumni - INFOLEARNERS
Columbia College Chicago Notable Alumni - INFOLEARNERS
 
001 P1 Accounting Essay Thatsnotus. Online assignment writing service.
001 P1 Accounting Essay Thatsnotus. Online assignment writing service.001 P1 Accounting Essay Thatsnotus. Online assignment writing service.
001 P1 Accounting Essay Thatsnotus. Online assignment writing service.
 
Essay Writing Tips That Will Make Col. Online assignment writing service.
Essay Writing Tips That Will Make Col. Online assignment writing service.Essay Writing Tips That Will Make Col. Online assignment writing service.
Essay Writing Tips That Will Make Col. Online assignment writing service.
 
Pin On Essay Writer Box. Online assignment writing service.
Pin On Essay Writer Box. Online assignment writing service.Pin On Essay Writer Box. Online assignment writing service.
Pin On Essay Writer Box. Online assignment writing service.
 
How To Write A Funny Essay For College - Ai
How To Write A Funny Essay For College - AiHow To Write A Funny Essay For College - Ai
How To Write A Funny Essay For College - Ai
 

Recently uploaded

Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon AUnboundStockton
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Educationpboyjonauth
 
Meghan Sutherland In Media Res Media Component
Meghan Sutherland In Media Res Media ComponentMeghan Sutherland In Media Res Media Component
Meghan Sutherland In Media Res Media ComponentInMediaRes1
 
Final demo Grade 9 for demo Plan dessert.pptx
Final demo Grade 9 for demo Plan dessert.pptxFinal demo Grade 9 for demo Plan dessert.pptx
Final demo Grade 9 for demo Plan dessert.pptxAvyJaneVismanos
 
Types of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxTypes of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxEyham Joco
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...Marc Dusseiller Dusjagr
 
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Celine George
 
Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementmkooblal
 
MARGINALIZATION (Different learners in Marginalized Group
MARGINALIZATION (Different learners in Marginalized GroupMARGINALIZATION (Different learners in Marginalized Group
MARGINALIZATION (Different learners in Marginalized GroupJonathanParaisoCruz
 
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfEnzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfSumit Tiwari
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptxVS Mahajan Coaching Centre
 
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfFraming an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfUjwalaBharambe
 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatYousafMalik24
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)eniolaolutunde
 
Pharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfPharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfMahmoud M. Sallam
 
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTiammrhaywood
 
Painted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaPainted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaVirag Sontakke
 
Capitol Tech U Doctoral Presentation - April 2024.pptx
Capitol Tech U Doctoral Presentation - April 2024.pptxCapitol Tech U Doctoral Presentation - April 2024.pptx
Capitol Tech U Doctoral Presentation - April 2024.pptxCapitolTechU
 

Recently uploaded (20)

Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon A
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Education
 
OS-operating systems- ch04 (Threads) ...
OS-operating systems- ch04 (Threads) ...OS-operating systems- ch04 (Threads) ...
OS-operating systems- ch04 (Threads) ...
 
Meghan Sutherland In Media Res Media Component
Meghan Sutherland In Media Res Media ComponentMeghan Sutherland In Media Res Media Component
Meghan Sutherland In Media Res Media Component
 
Final demo Grade 9 for demo Plan dessert.pptx
Final demo Grade 9 for demo Plan dessert.pptxFinal demo Grade 9 for demo Plan dessert.pptx
Final demo Grade 9 for demo Plan dessert.pptx
 
Types of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxTypes of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptx
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
 
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
 
Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of management
 
MARGINALIZATION (Different learners in Marginalized Group
MARGINALIZATION (Different learners in Marginalized GroupMARGINALIZATION (Different learners in Marginalized Group
MARGINALIZATION (Different learners in Marginalized Group
 
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfEnzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
 
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
 
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfFraming an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice great
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)
 
Pharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfPharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdf
 
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
 
Painted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaPainted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of India
 
Capitol Tech U Doctoral Presentation - April 2024.pptx
Capitol Tech U Doctoral Presentation - April 2024.pptxCapitol Tech U Doctoral Presentation - April 2024.pptx
Capitol Tech U Doctoral Presentation - April 2024.pptx
 

A Conceptual Framework For Scaffolding III-Structured Problem-Solving Processes Using Question Prompts And Peer Interactions

  • 1. Scaffolding Students’ Problem-Solving Processes in an Ill-Structured Task Using Question Prompts and Peer Interactions Xun Ge Susan M. Land This study examined the effects of question prompts and peer interactions in scaffolding undergraduate students’ problem-solving processes in an ill-structured task in problem representation, developing solutions, making justifications, and monitoring and evaluating. A quasi-experimental study, supplemented by multiple-case studies, was conducted to investigate both the outcomes and the processes of student problem-solving performance. The quantitative outcomes revealed that question prompts had significantly positive effects on student problem-solving performance but peer interactions did not show significant effects. The qualitative findings, however, did indicate some positive effects of peer interactions in facilitating cognitive thinking and metacognitive skills. The study suggests that the peer interaction process itself must be guided and monitored with various strategies, including question prompts, in order to maximize its benefits. Many researchers (e. g., Bransford, Brown, & Cocking, 2000; Bransford & Stein, 1993; Jonassen, 1997) have emphasized the importance of engag- ing students in complex, ill-structured problem- solving tasks, which are intended to help students see the meaningfulness and relevance of what they learn and to facilitate transfer by con- textualizing knowledge in authentic situations. Yet previous research has pointed to student deficiencies in problem solving, for instance, a failure to apply knowledge from one context to another (Gick, 1986; Gick & Holyoak, 1980), espe- cially when solving ill-structured problems (Fel- tovich, Spiro, Coulson, & Feltovich, 1996). Students’ difficulties in problem solving have been attributed to both limited domain and metacognitive knowledge (Brown, 1987). According to Vygotsky (1978), learners should be guided or scaffolded by a “more capable peer” to solve a problem or carry out a task that would be beyond what they could ac- complish independently (p. 86). The notion of scaffolding has traditionally emphasized the role of dialogue and social interaction to foster comprehension-monitoring strategies (see e.g., reciprocal teaching and peer-regulated learning, Palincsar & Brown, 1984; Palincsar, Brown, & Martin, 1987); however, externalized support during problem solving has also been ac- complished through strategies such as modeling (Schoenfeld, 1985), prompting (Scardamalia, Bereiter, McLean, Swallow, & Woodruff, 1989; Scardamalia, Bereiter, & Steinbach, 1984), and guided, student-generated questioning (King, 1991). Such strategies have been found to be ef- fective in fostering comprehension, monitoring, ETR&D, Vol. 51, No. 1, 2003, pp. 21–38 ISSN 1042–1629 21
  • 2. problem solving (e.g., Palincsar & Brown, 1984; Scardamalia et al., 1989), and reflective thinking (Lin, Hmelo, Kinzer, & Secules, 1999). However, previous research has seldom in- vestigated the effectiveness of those strategies in supporting ill-structured problem solving, a process that characterizes the types of complex problems that we encounter in everyday life. Ill- structured problems have vaguely defined or unclear goals (Voss & Post, 1988), and the infor- mation needed to solve them is not entirely con- tained in the problem statements (Chi & Glaser, 1985). A problem qualifies as ill defined if any one of the three components (an initial state, operators, and a goal state) is not well specified (Chi & Glaser). We believed that scaffolding strategies could be adapted to support students’ cognitive and metacognitive skills during ill- structured problem solving. In this study, we were specifically interested in examining the ef- fects of question prompts and peer interactions to scaffold novice learners’ problem-solving processes in an ill-structured task. Theoretical Background Ill-Structured Problem-solving Processes Ill-structured problems are defined as having vague goals (Chi & Glaser, 1985; Voss & Post, 1988) that permit multiple solutions or solution paths (Kitchner, 1983). By contrast, well-struc- tured problems have single solutions, optimal solution paths, and structured goals (Sinnott, 1989). Solving well-structured problems nor- mally involves representing the problem, sear- ching for solutions, and implementing solutions (Gick, 1986). However, because of the nature of an ill-structured problem, its solution process is different from that of a well-structured problem. Problem representation, justification skills, monitoring, and evaluation are the primary re- quirements for ill-structured problem solving (Sinnott; Voss & Post). According to Voss and Post, problem representation involves examin- ing concepts and relations of a problem, isolat- ing the major factors causing the problem and its constraints, and recognizing divergent perspec- tives. Once a problem is represented, solutions can be derived by finding ways to eliminate the causes of the problem and then developing cor- responding procedures for implementing them. Since a problem solver must select a good solu- tion from among many, he or she must generate a viable, defensible, and cogent argument to sup- port the problem solution (Jonassen, 1997; Voss & Post); thus justification skills are paramount to solving ill-structured problems (Jonassen; Kitchner & King, 1981). In addition, the problem solver must evaluate his or her solution by ex- amining and defending it against other alterna- tives. Hence, monitoring and evaluation are required throughout the process, from identify- ing the essence of the problem to selecting the best goal for solving it (Sinnott). Learners monitor their own processes and movements from state to state, and select information, solu- tions, and emotional reactions (Sinnott). Cognition and Metacognition in Solving Ill-Structured Problems Solving ill-structured problems requires domain-specific knowledge (Voss & Post, 1988; Voss, Wolfe, Lawrence, & Engle, 1991) as well as structural knowledge (Jonassen, Beissner, & Yacci, 1993). Domain-specific knowledge is con- tent knowledge consisting of cognitive com- ponents such as propositional information, concepts, rules, and principles. 
Structural knowledge is knowledge of how concepts within a domain are interrelated and requires integration of declarative knowledge into useful knowledge structures (Jonassen et al.). How- ever, in the absence of domain-specific knowledge and structural knowledge, metacog- nition, which involves both knowledge and regulation of cognition (Pressley & McCormick, 1987), is necessary for solving ill-structured problems. Chi, Bassok, Lewis, Reimann, and Glaser (1989) found that successful learners tend to generate more working explanations, par- ticularly in response to an awareness of limited understanding. Wineburg (1998) found that metacognitive knowledge can compensate for absence of relevant domain knowledge when metacognitive awareness leads to recognizing areas of limited understanding, adopting work- ing hypotheses, asking questions, monitoring thinking, and revisiting early interpretations. 22 ETR&D, Vol. 51, No. 1
  • 3. Question Prompts as a Scaffolding Strategy Question prompts have been found effective to help students focus attention and monitor their learning through elaboration on the questions asked (Rosenshine, Meister, & Chapman, 1996). Scardamalia et al. (1984) first used procedural prompts, such as “An example of this . . .” and “Another reason that is good. . .,” to scaffold learners with specific procedures or suggestions to help them plan their writing. Later, King (1991, 1992, 1994) provided students with strategy-questioning prompt cards to teach them how to make inferences and generaliza- tions and to ask for and provide task-ap- propriate elaboration. In one study, King (1991) specifically emphasized the role of question prompts in scaffolding metacognition. She grouped questions into three metacognitive categories: planning, monitoring, and evalua- tion, which closely paralleled the general prob- lem-solving model (problem identification, searching for a solution, implementation of a solution, and evaluation, Bransford & Stein, 1993; Gick, 1986). Questions such as “What is the problem?” and “What do we know about the problem so far?” were asked to help students with planning. Recently, researchers have integrated prompts into computer-based instruction to facilitate metacognition (Davis & Linn, 2000; Hannafin, Land, & Oliver, 1999; Lin & Lehman, 1999). Lin and Lehman found that justification prompts facilitated transfer to a contextually dissimilar problem. Similarly, Davis and Linn found that self-monitoring prompts embedded in the Web-based knowledge integration en- vironment (KIE) encouraged students to think carefully about their activities and facilitated planning and reflection. Hence, we believed that question prompts could scaffold ill-structured problem solving by eliciting thoughtful respon- ses such as explanations and inferences (King & Rosenshine, 1993) and constructing cogent argu- ments (Kitchner & King, 1981). Peer Interaction as a Scaffolding Strategy Lin et al. (1999) argued that peer interaction sup- ports reflective social discourse, thereby helping learners to consider multiple points of views and select the best one based on evidence. Previous research (e.g., King, 1991; Palincsar et al., 1987; Webb, 1989) indicated that peer interaction could be an effective scaffolding strategy. Peer interaction can be guided or unguided. Guided peer interaction is typically modeled by a teacher with specific instructions, such as Palinc- sar and Brown’s (1984) reciprocal teaching, in which a teacher initially models key activities such as summarizing, questioning, predicting, and clarifying, and then both the teacher and the student take turns leading a dialogue. Addition- al examples are Palincsar et al.’s peer modeling process, whereby seventh-graders were taught to be tutors to their same-age tutees, and King’s studies (e.g., 1991, 1992, 1994), which focused on guiding students to generate questions and elaborate thinking during the peer interaction process. A developed body of research on cooperative learning in the ’80s and early ’90s (e.g., Johnson, Johnson, & Stanne, 1985, 1986; Johnson, Johnson, Stanne, & Garibaldi, 1990) also revealed the success of guided peer interac- tions in improving student performance and achievement by employing various group processing strategies. The peer interaction described in Webb’s (1989) study was not specifically guided. 
It was characterized by small groups of students who were given materials to learn or a problem to solve and were expected to help each other learn the material. They were not given specific roles, although they may have had different abilities and background experiences (Webb, 1989). Thus, their interaction was contingent on voluntary engagement and commitment to peer learning. Webb (1982, 1989) found that, when learners were required to give explanations to and ask questions of each other, learning was enhanced. Similarly, Greene and Land (2000) found that peer interaction during open-ended learning was effective when group members offered suggestions, negotiated ideas, and shared their experiences. The process of explanation presumably requires learners to clarify concepts, reorganize thinking, and reconceptualize the material. In the present study, both guided (with question prompts) and unguided peer interactions were studied to examine whether they had differential effects in facilitating ill-structured problem solving.
Purpose of the Study

Despite the justification for the use of question prompts to facilitate problem-solving activities, the relationship between questioning strategies and ill-structured problem solving has not been sufficiently studied. A review by Rosenshine et al. (1996) revealed that most studies of questioning strategies focused on activating prior knowledge and improving comprehension. King (1991) studied the effects of guided, student-generated questions during peer interactions on metacognitive skills, knowledge construction, and problem solving. However, in King's (1991) study, the problem-solving task was well structured and the subjects were children. Our study aimed to extend King's research on questioning strategies to the context of ill-structured problem solving and to an adult population (i.e., college students).

The purpose of this study was to investigate the effects of (a) question prompts, (b) peer interactions, and (c) the combined strategies of question prompts and peer interactions in scaffolding undergraduate students' problem-solving processes in an ill-structured task. The problem-solving outcomes and processes investigated were (a) problem representation, (b) problem solution, (c) making justifications, and (d) monitoring and evaluation, which constitute the major processes of ill-structured problem solving (e.g., Jonassen, 1997; Sinnott, 1989; Voss, 1988; Voss & Post, 1988).

The question prompts in this study referred to a set of questions, both domain specific and metacognitive, that prompted students to attend to important aspects of a problem at different phases and assisted them to plan, monitor, and evaluate the solution process. They were categorized into different functional types, which closely paralleled the four processes of ill-structured problem solving. For example, paralleling the process of monitoring and evaluation was a series of questions asked under the category "Am I on the right track?" The question prompts were delivered either in printed format or through the Web.

The peer interaction strategy under investigation is defined as small groups of three or four students who were given an ill-defined problem and told to collaborate in solving it. Students were not assigned specific roles. They were expected to engage in the problem-solving task and actively interact with each other to negotiate meaning, share knowledge, and develop solutions. There were two versions of peer interaction: guided (with question prompts) versus unguided.

The study examined the following questions:

1. Does using question prompts and peer interactions, separately or in combination, affect student problem-solving processes (problem representation, solution development, justification, and monitoring and evaluation of solutions) in an ill-structured task?

2. Does using question prompts and peer interactions, separately or in combination, influence student cognition and metacognition in the process of developing solutions to ill-structured problems?

The following hypotheses were generated from Question 1 and were tested:

Hypothesis 1. Students receiving question prompts will demonstrate better problem-solving performance in an ill-structured task than those who do not receive question prompts in problem representation, solution development, justification, and monitoring and evaluation of solutions.
Previous research has shown that question prompts can facilitate explanation construction (King, 1991, 1992; King & Rosenshine, 1993); planning, monitoring, and evaluation (Davis & Linn, 2000; King, 1991; Schoenfeld, 1985); and making justifications (Lin & Lehman, 1999).

Hypothesis 2. Students working with peers will demonstrate better problem-solving performance in an ill-structured task than those working individually in problem representation, solution development, justification, and monitoring and evaluation of solutions. Peer modeling and interaction have been found to facilitate self-regulation (Brown & Palincsar, 1989), distribute expertise, and foster reflection on multiple perspectives (e.g., Roschelle, 1992; Webb, 1989).

Hypothesis 3. Students working with peers and also receiving question prompts will demonstrate better problem-solving performance in an ill-structured task than all the other treatment
groups in problem representation, solution development, justification, and monitoring and evaluation of solutions. Previous research has shown that guided peer interaction is more effective than unguided peer interaction (e.g., Johnson et al., 1990; King, 1991; King & Rosenshine, 1993; Palincsar et al., 1987).

METHOD

Design

A quasi-experimental study, supplemented by comparative, multiple-case studies, was employed to investigate the two research questions. According to Greene, Caracelli, and Graham (1989), using both quantitative and qualitative methods helps a researcher to seek triangulation of the results from different data sources; examine overlapping and different facets of a phenomenon; discover paradoxes, contradictions, and fresh perspectives; and expand the scope and breadth of a study. The quasi-experimental study, designed to answer Research Question 1, was conducted to measure students' problem-solving outcomes in an ill-structured task in the four problem-solving processes: (a) problem representation, (b) solution development, (c) justification, and (d) monitoring and evaluation of solutions. The comparative, multiple-case studies (Yin, 1989) served two purposes: (a) to supplement and explain findings for Research Question 1 and (b) to explore Research Question 2 to gain insights into students' problem-solving processes through think-aloud protocols, interviews, and observations.

Participants and Context of the Study

Participants in the quasi-experimental design were 117 undergraduate students recruited from three class sections of an introductory course in information sciences and technology (IST) at a major university in the northeastern United States. Of these, 19 also participated in the comparative, multiple-case studies. Most of the students were freshmen majoring in information sciences and technology, with a few students from other majors.

The course was designed not only to provide an overview of information sciences and technology, but also to integrate collaborative learning and problem-solving skills. It consisted of both lecture and laboratory sessions. There were two lecture sessions and one laboratory session per week. The 75-min lecture sessions were taught by a professor of information sciences and technology; the 115-min laboratory session was conducted by a teaching assistant. Each of the three class sections was taught by a different professor. All were equally experienced in teaching the subject. Two teaching assistants taught the laboratory sections, one being the first author of this study, who conducted the labs for one of the class sections. All three class sections shared a common curriculum and a core textbook, with approximately 50 students in each.

The primary purpose of the laboratory sessions was to provide hands-on experience and technology skills related to information sciences and technology. There were two major goals of the laboratory sessions: (a) developing basic information technology skills through skill-module exercises (e.g., spreadsheets and database management systems), and (b) developing problem-solving and collaborative learning skills through case studies.

The Quasi-Experimental Study

The four conditions of the quasi-experimental study were (a) peer-question (PQ), (b) peer-control (PC), (c) individual-question (IQ), and (d) individual-control (IC). We measured students' problem-solving performance in an ill-structured task, the output of which was a problem-solving report.
Sampling and Treatment Assignment

In order to study students' problem-solving performance in the natural setting of the classroom, the study was integrated into the curriculum and administered during a 115-min laboratory session. Each class section was randomly assigned as an intact group to one or two of the treatment conditions. Because there were only three classes to be used for four different treatment conditions, one class had to be split into two conditions. Fifteen participants in the IQ
condition and 16 participants in the IC condition were randomly assigned from Class A. Thirteen groups comprising 48 participants in Class B were assigned to the PQ condition, while 11 groups totaling 38 students in Class C were assigned to the PC condition. Those groups were preexisting, having been previously formed by the course professors for the class projects. The normal size of each group was 4 students; however, due to attrition and absence, some variations in group size occurred, resulting in some 3-member groups. In the PQ condition there were nine 4-member groups and four 3-member groups; in the PC condition there were five 4-member groups and six 3-member groups. While the uneven distribution of group sizes across the two conditions might be a concern, Lou, Abrami, and d'Apollonia's (2001) literature review found that small groups of 3 to 4 members were more effective than larger groups, which suggested little difference between 3-member and 4-member groups.

Because of various constraints when this study was conducted, pretest data to establish equivalence of the three intact groups were not available. Instead, a brief survey was conducted at the end of the study, which provided useful information on the participants' profiles and prior problem-solving experience across different conditions. In the IQ condition, there were 14 (93%) IST-major students and 1 (7%) nonmajor student. In the IC condition, there were 14 (87.5%) IST-major students and 2 (12.5%) nonmajor students. In the PQ condition, there were 43 (90%) IST-major students as compared with 5 (10%) nonmajor students (3 of them majoring in related fields, such as computer science). The PC condition consisted of 31 (82%) IST-major students as compared with 7 (18%) nonmajor students. More than 70% of the students in the IQ, IC, and PQ conditions and about 60% of the students in the PC condition reported that they had some kind of previous problem-solving experience. The self-rated problem-solving skills across the four conditions were statistically analyzed and did not indicate significant differences.

Measurement and Treatment Material

The ill-structured problem-solving task material was a complex, real-world problem related to the domain of information science and technology, developed by a course professor. The materials were then validated by other IST professors based on the major attributes of an ill-structured problem (e.g., Chi & Glaser, 1985; Jonassen, 1997; Kitchner, 1983; Sinnott, 1989; Voss, 1988; Voss & Post, 1988). The problem scenario for the task, presented below, was ill-structured in nature because subgoals were not clear and the students had to generate and define them. Additionally, the information needed to solve the problem was not entirely contained in the problem, the operators (actions to be taken to solve the problem) were not defined, and multiple solutions were possible.

    Many customers complain that they have difficulty finding items in a large supermarket (the W Store). This problem especially affects college students, who often have very little time for shopping. Since students are major customers in this small college town, the manager of the local store has hired you (or your team) as a consultant to propose information technology-based solutions to the problem. Your task is to make suggestions about the features to be included in a new information system. As part of this, you are to develop a simple model illustrating your proposed system. Based on the findings of a survey, the proposed information system should be able to help customers find items quickly, to present an overall view of all the items on a shelf and an aisle, and to map out the shortest route for getting all the items a customer needs to purchase. There may be some other important factors you need to consider.
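To give a concrete sense of the technical components the scenario invites, the following minimal sketch (ours, not part of the study materials or any student's solution) illustrates the kind of prototype a team might begin with: an item-location lookup plus a simple aisle-ordering heuristic standing in for a true shortest-route computation. The catalog data and function names are hypothetical.

```python
# Hypothetical sketch of a store item-finder prototype (illustrative only;
# the actual student prototypes were database systems built in the lab).

# Item catalog: item name -> (aisle number, shelf label)
CATALOG = {
    "milk":   (12, "B"),
    "bread":  (3,  "A"),
    "coffee": (7,  "C"),
    "pasta":  (3,  "D"),
}

def locate(item):
    """Return the (aisle, shelf) location of an item, or None if unknown."""
    return CATALOG.get(item.lower())

def shopping_route(items):
    """Order a shopping list by aisle so the customer walks each aisle once,
    a crude stand-in for a real shortest-route computation."""
    located = [(name, CATALOG[name]) for name in items if name in CATALOG]
    return sorted(located, key=lambda pair: pair[1][0])

if __name__ == "__main__":
    for name, (aisle, shelf) in shopping_route(["milk", "bread", "pasta"]):
        print(f"{name}: aisle {aisle}, shelf {shelf}")
```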
Students in all four conditions were instructed to analyze the problem, propose information technology solutions, support their solutions with evidence, and evaluate their solutions. The output of the task was a two- to three-page solution report, accompanied by a diagram of the proposed system. In addition, the students were asked to produce a prototype of the database system described in their solution reports in order to satisfy the laboratory requirement.

Task performance was measured based on assigned conditions instead of individual learning outcomes. In other words, individual students were measured according to their individual solution reports, while groups were measured as collective units according to their group solution reports. The comparison of the individual reports and the group collective
reports was made because this study focused not on measuring individual learning outcomes under different treatments, but rather on student performance in different grouping contexts: individuals versus groups.

The question-prompt treatment material (Appendix A) was a list of 10 major questions generated from the problem by the course professors, the IST experts. The question prompts were then organized and categorized into four types: (a) problem representation prompts, (b) solution prompts, (c) justification prompts, and (d) monitoring and evaluation prompts. Each category of prompt included some subquestions. For instance, included in the category of problem representation prompts (How do I define the problem?) were subquestions such as What are the parts of the problem? and What are the technical components of the problem?

An analytical rubric system developed by the researchers was used to evaluate students' problem-solving reports. It was based on the theoretical framework of ill-structured problem solving (e.g., Chi & Glaser, 1985; Jonassen, 1997; Kitchner, 1983; Sinnott, 1989; Voss, 1988; Voss & Post, 1988), and was reviewed and validated by both the course professors and some experts in the area of rubric development. The rubrics were modified and revised based on feedback before being finalized. The rubric system had four major constructs, each measuring one of the four problem-solving processes in an ill-structured task: (a) problem representation, (b) developing solutions, (c) making justification for generating or selecting solutions, and (d) monitoring and evaluating the problem space and solutions. Each construct embodied specific attributes, with performance specifications, criteria, and ordinal values on different point scales, such as 0–1–2–3 or 0–2–4. For instance, the construct making justifications was evaluated by two specific attributes: (a) constructing argument (on a 0–2–4 scale) and (b) providing evidence (on a 0–1–2–3 scale). In evaluating providing evidence, 0 was assigned if no evidence was provided, 1 if the evidence provided was not plausible, 2 if the evidence was based on hypothetical examples, and 3 if the evidence was based on previous experience or real examples. In evaluating constructing argument, 0 was assigned if no argument was constructed, 2 if an argument was poorly constructed, and 4 if an argument was well constructed. Because constructing argument was an important attribute, a 0–2–4 scale, rather than a continuous scale (i.e., 0–1–2), was used to differentiate distinctly the students who failed to provide an argument from those who provided minimal or weak arguments, and from those who provided sound and cogent arguments. The earned points for constructing argument and providing evidence were summed, on a range of 0–7 points, to give an overall score for the construct, making justification.
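As an illustration of the rubric arithmetic just described, the sketch below encodes the two attribute scales and their sum. It is our rendering of the scoring logic, not the authors' scoring instrument, and the level labels are paraphrased from the rubric.

```python
# Illustrative scoring of the "making justification" construct, following the
# rubric described above: constructing argument on a 0-2-4 scale, providing
# evidence on a 0-1-2-3 scale, summed to an overall 0-7 construct score.

ARGUMENT_SCALE = {"none": 0, "poorly constructed": 2, "well constructed": 4}
EVIDENCE_SCALE = {
    "none": 0,
    "not plausible": 1,
    "hypothetical examples": 2,
    "experience or real examples": 3,
}

def justification_score(argument_level, evidence_level):
    """Sum the two attribute scores into the 0-7 construct score."""
    return ARGUMENT_SCALE[argument_level] + EVIDENCE_SCALE[evidence_level]

# A well-constructed argument supported by real examples earns the maximum 7.
assert justification_score("well constructed", "experience or real examples") == 7
```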
Administering the Study Sessions

The experimental study was administered by the first author and a colleague in three 115-min laboratory sessions in the same week. All the study sessions were conducted in a classroom equipped with laptop computers and LCD projectors, where the participants had their regular lectures and lab sessions. In the first study session, administered to the PQ condition, the participants were told to work on the problem-solving task in their preassigned groups. The problem task materials and the question prompts were posted on the course Web site, to which students had access during the session. At the same time, they were provided with duplicate materials in paper format as a backup against any unexpected technological problems. The students in this condition were frequently reminded to refer to the question prompts while solving the problem. In the second study session, administered to the PC condition, students were also told to work on the problem-solving task in their preassigned groups. The problem-solving task material was delivered to them in the same way as to the PQ group, but they were not provided with question prompts. The third study session was conducted with the IQ and IC conditions in the same class section. These participants had been randomly and previously assigned to either the IQ or the IC condition, and they were
seated on opposite sides of the room. The first author passed out the problem-solving task in handouts (together with question prompts for the IQ condition), which were color-coded for the two different conditions. The participants were instructed to work individually. Throughout all the study sessions, the researcher attended only to students' questions that related to procedures or requirements for the study. No hints or assistance associated with the problem were provided.

Quantitative Data Analysis

Three raters, including the first author, evaluated the problem-solving reports. Before evaluating, they reached a conceptual consensus on how to interpret the scoring rubrics through discussion and examples. The first author evaluated all the reports; the other two raters evaluated 70% of the reports. Any discrepancies in assigned values were discussed among the raters, and the adjudicated score was used. Consequently, a high consensus was reached.

A 2 × 2 multivariate analysis of variance (MANOVA) was conducted to examine the effects of question prompts and peer interactions, as well as the interactive effect of question prompts and peer interactions. Wilks's lambda F (α = .05) was used in interpreting the multivariate test results. The use of MANOVA was justified because of an overall correlation among the four dependent variables (problem representation, developing solutions, making justifications, and monitoring and evaluation) indicated by the results of Pearson's correlation, which were significant at the .01 level. As shown by the results of Box's M test and Levene's test, the assumption of equal variance was met at the .05 alpha level, thus satisfying the MANOVA testing assumption that the residual errors follow a multivariate normal distribution in the population. All the analyses were done with the Statistical Package for the Social Sciences (SPSS 11.0 for Windows).

The Comparative, Multiple-Case Studies

In the comparative, multiple-case studies, a case was defined as an individual participant or a peer group. Four individual participants and four peer groups comprised eight separate cases. Selective (discriminative) sampling (Strauss & Corbin, 1998) was used to maximize the representation of cases (Stake, 2000) and the opportunities for comparative analysis across different conditions. Within each condition, participants were selected based on informed consent, level of verbal interaction (in the peer conditions), and willingness to be audiotaped or videotaped for think-aloud protocols, observations, and interviews.

Data Collection Techniques and Procedures

Think-aloud protocols are the verbalization of one's thinking process (Ericsson & Simon, 1996). In this study, they referred to an individual's verbalizations while engaged in the problem-solving task. The verbal protocols were audio recorded and later transcribed verbatim. The first author administered the think-aloud sessions to the four individuals (two in the IQ condition and two in the IC condition) separately and independently from the other participants in the same condition. She demonstrated the think-aloud procedure through examples and made sure that each participant could follow the procedure before beginning to record. The first author occasionally reminded participants to talk out loud or to raise their voices.

The observations were made on videotape during the experimental study session and captured both actions and verbalizations.
The purpose of videotaping the cases was to gain a more detailed understanding of the problems and processes experienced by learners during ill-structured problem solving and of how the scaffolds might have supported them in this process. The selected groups were observed together with the other groups of participants in the same classroom. The first author circulated about the room and took notes on interactions for both the PQ and the PC conditions. Videotapes of the problem-solving processes of the selected groups were later transcribed verbatim and analyzed.

Structured interview protocols, built around questions of what, how, and why, were used to prompt students to recall their problem-solving processes
and the effects of the question prompts and peer interactions; for example:

• Would you please tell me how you solved the problem, in detail; for example, how you approached the problem at first and how you came up with solutions?

• What were your reasons for selecting those solutions?

• Did you find the questions provided helpful? In what ways? Please give examples.

• Did the group help you to solve this problem? How? Please give examples.

Except for one group, in which a member did not want to be videotaped (and instead was audio recorded), all the interviews with the peer conditions were videotaped. All the interviews with individual participants in the IQ and IC conditions were audiotaped. The interview sessions lasted approximately 30–40 min.

Qualitative Data Analysis

Pseudonyms were used for the eight selected cases to protect the identity of the participants. All the audiotaped and videotaped data from the think-aloud protocols, observations, and interviews were transcribed for data analysis. Miles and Huberman's (1994) data analysis model, which involves data reduction, data display, and conclusion drawing and verification, was used to guide the qualitative data analysis. The data analysis primarily consisted of the following steps: reading and jotting marginal notes on the transcripts; identifying patterns and labeling concepts; organizing labeled concepts into data display matrices; identifying themes; and drawing conclusions. For example, the constructs and attributes of the rubrics were used for labeling to examine the participants' performance in each of the problem-solving processes, and new concepts were generated to examine student behavior, reactions, and cognitive and metacognitive processes in the context of question prompting or peer interaction. The next procedure was to organize and display the labeled concepts so that comparisons could be made across different cases and conditions. The data were displayed in different ways to be viewed from different dimensions; for instance, different conditions were compared according to the four problem-solving processes, and student behaviors and reactions in the peer or prompting conditions were organized in data display matrices (see examples in Appendix B). Finally, conclusions were drawn, supported by examples.

RESULTS

Quantitative Outcomes

Table 1 summarizes the descriptive statistics for the four problem-solving processes (the dependent variables) by two factors: (a) individuals vs. peers and (b) question prompts vs. no question prompts. The n for peers in the table indicates the number of groups, each of which consisted of three to four students. The results of the eight selected cases were also included in the data analysis. Below, the statistical results are reported in response to each of the hypotheses tested.

Question Prompting Effects

The hypothesis on question prompts predicted that students who received question prompts would perform significantly better than students who did not receive question prompts. The results of the two-way MANOVA revealed a significant main effect for question prompts, F(4, 48) = 17.371, p < .001, η² = .591, which supported the hypothesis.
Further, univariate tests of between-subjects effects revealed significant effects of question prompts in all four problem-solving processes: problem representation, F(1, 51) = 51.051, p < .001, MSE = 2.227, η² = .500; generating solutions, F(1, 51) = 21.429, p < .001, MSE = .960, η² = .296; making justification, F(1, 51) = 32.929, p < .001, MSE = 1.424, η² = .392; and monitoring and evaluation, F(1, 51) = 21.336, p < .001, MSE = 3.658, η² = .295. Table 1 shows the means and standard deviations of the question-prompt treatment groups in comparison with the control groups (no-question-prompt conditions), indicating that students who received question prompts significantly outperformed those who did not in all four problem-solving processes.

Table 1. Means and standard deviations for the problem-solving processes by question prompts vs. no question prompts and individuals vs. peers.

                              Individuals            Peers                  Total
                            M     SD     n       M     SD     n       M     SD     n
Problem Representation
  Question Prompts        4.47  (1.55)   15    6.23  (1.74)   13    5.29  (1.84)   28
  No Question Prompts     2.25  (1.13)   16    2.64  (1.57)   11    2.41  (1.31)   27
  Total                   3.32  (1.74)   31    4.58  (2.45)   24    3.87  (2.15)   55
Generate Solutions
  Question Prompts        6.13  (0.83)   15    7.08  (0.95)   13    6.57  (1.00)   28
  No Question Prompts     5.38  (0.89)   16    5.36  (1.29)   11    5.37  (1.04)   27
  Total                   5.74  (0.93)   31    6.29  (1.40)   24    5.98  (1.18)   55
Make Justification
  Question Prompts        5.00  (1.20)   15    5.54  (1.27)   13    5.25  (1.24)   28
  No Question Prompts     3.63  (1.31)   16    3.18  (0.87)   11    3.44  (1.15)   27
  Total                   4.29  (1.42)   31    4.46  (1.61)   24    4.36  (1.50)   55
Monitor & Evaluate
  Question Prompts        4.20  (2.11)   15    4.31  (2.18)   13    4.25  (2.10)   28
  No Question Prompts     1.88  (1.59)   16    1.82  (1.72)   11    1.85  (1.61)   27
  Total                   3.00  (2.18)   31    3.17  (2.32)   24    3.07  (2.22)   55

Note. The possible ranges of scores for Problem Representation, Generating Solutions, Making Justification, and Monitoring and Evaluating are 0–10, 0–8, 0–7, and 0–7, respectively.
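The analyses reported here were run in SPSS 11.0. For readers who wish to reproduce the general approach with open tools, the sketch below shows how a comparable 2 × 2 MANOVA with univariate follow-ups might be specified in Python with statsmodels; the data file and column names are our own hypothetical stand-ins, not the study's materials.

```python
# Sketch of a 2 x 2 MANOVA comparable to the one reported above, using
# statsmodels (the study itself used SPSS 11.0). File and column names
# are hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols
from statsmodels.multivariate.manova import MANOVA

# One row per scoring unit (individual or group) with:
#   prompts: 1 = question prompts, 0 = none;  peers: 1 = group, 0 = individual
#   represent, solve, justify, monitor: the four rubric scores
df = pd.read_csv("problem_solving_scores.csv")

# Multivariate test (Wilks' lambda is among the reported statistics) for the
# main effects of prompts and peers and for their interaction.
mv = MANOVA.from_formula(
    "represent + solve + justify + monitor ~ prompts * peers", data=df
)
print(mv.mv_test())

# Univariate follow-ups: one two-way ANOVA per dependent measure.
for dv in ["represent", "solve", "justify", "monitor"]:
    model = ols(f"{dv} ~ prompts * peers", data=df).fit()
    print(dv, sm.stats.anova_lm(model, typ=2), sep="\n")
```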
Peer Interaction Effects

The hypothesis on peer interactions predicted that students working with peers would perform significantly better than those working individually in the problem-solving processes. The results of the two-way MANOVA did not reveal significant effects for peer interactions, F(4, 48) = 2.308, p = .071, η² = .161, and thus failed to support the hypothesis. However, as shown in Table 1, the mean score of the peer conditions was considerably higher than that of the individual conditions in problem representation. To further explore the trend shown in the means, we ran a post hoc univariate test. The result showed that the peers significantly outperformed the individuals in problem representation, F(1, 51) = 6.991, p = .011, MSE = 2.227, η² = .121.

Interactive Effects of Question Prompts and Peer Interactions

It was hypothesized that students working with peers and also receiving question prompts would demonstrate significantly better problem-solving skills than all the other conditions. However, the results did not show a significant interactive effect of the two strategies, F(4, 48) = 1.298, p = .284, η² = .098, and thus failed to support the hypothesis. Despite this result, Table 1 shows a trend for the PQ condition to have higher means than the other conditions (IQ, IC, and PC) in problem representation and generating solutions.

Qualitative Findings

The eight cases selected for in-depth qualitative study provided us with further insights into the participants' problem-solving performance in different conditions, supplementing the quantitative findings. Below is a brief report of the performance of the cases on the problem-solving report, followed by a summary of the qualitative findings on the effects of the question prompts and the peer interactions.

Overall performance of cases on problem-solving reports. Table 2 presents the raw scores of the cases on the solution report. In general, the cases in the question-prompt conditions (IQ and PQ) showed higher raw scores than those in the no-question-prompt conditions (IC and PC) in the
four problem-solving processes; the cases in the peer condition without question prompts (PC) did not show advantages over the individual conditions (IQ and IC) in any of the problem-solving processes. The PQ condition showed the best performance in problem representation and generating solutions.

Table 2. The raw scores of the cases on their problem-solving reports.

                                Individual-Question   Individual-Control   Peer-Question     Peer-Control
                                 Case 1   Case 2       Case 3   Case 4     Case 5   Case 6   Case 7   Case 8
Problem Representation (0–10)      3        3            1        2          8        8        2        1
Developing Solutions (0–8)         7        6            6        5          8        8        6        6
Making Justification (0–7)         6        6            4        5          6        6        1        4
Monitoring & Evaluating (0–7)      4        7            2        3          3        7        1        3

Effects of the Question Prompts

Compared with the no-question-prompt conditions, the students who received question prompts engaged in the following cognitive and metacognitive activities: (a) making intentional efforts to identify factors, information, and constraints during the problem-representation process; (b) organizing and planning for the solution process and articulating solutions explicitly; (c) constructing arguments grounded in factors identified during problem representation and providing justification for each suggestion proposed; and (d) intentionally evaluating the selected solutions, comparing alternatives, and justifying the most viable solution. Because of space limitations, the examples presented below are selective and representative.

The qualitative results showed that the question prompts had the effect of directing student attention to important information they might have overlooked, thus facilitating awareness of what was known and not known. For example, in their think-aloud protocols, both Cathy (Case 1, IQ) and Joe (Case 2, IQ) were prompted by the questions to seek additional information and identify important factors that helped them represent the problem space and make connections among different factors and constraints. By comparison, Case 7 (PC) was observed to start the solution discussion right away, while Case 8 (PC) was observed to have some initial discussion of the problem; however, both groups failed to address problem subgoals, factors, and constraints in their reports. The question prompts seemed to help students analyze the problem and represent the problem space.

Consistent with Lin and colleagues' (1999) notion of process modeling, the question prompts might have also served as expert modeling to guide students through the problem-solving process. Joe (Case 2, IQ) said that the question prompts were helpful for organizing his thoughts. Perry (Case 6, PQ) mentioned that the problem seemed vague at first, but the question prompts served as guidelines to help his group break down the problem into small steps. Without question prompts, the students seemed to have difficulty representing the problem and developing solutions. As suggested by Paul in his interview (Case 3, IC), he had difficulty making connections between different parts of the problem and organizing the information coherently. It is well known that novices organize knowledge differently from experts (Chi, Feltovich, & Glaser, 1981; Gick, 1986), which suggests a need to help students engage the problem more deeply than their limited knowledge structures might normally permit.

It was observed that the question prompts also helped the students state the reasons for their proposed solutions and make their thinking visible. As shown by the think-aloud protocols, justification prompts such as What should the system do? and What are my reasons
for the solutions? prompted Joe (IQ) and Cathy (IQ) to articulate why they had selected particular solutions. The students in Case 6 (PQ) pointed out that the justification prompts helped them to clarify, justify, and write down the reasons for their solutions, which might not have been made explicit otherwise. In contrast, Paul (IC) mainly described how his proposed technological system would work instead of justifying his solution, as observed in both his report and his think-aloud protocols. Presumably, prompting learners to articulate their thinking helps them become more aware of what they know, which then makes their thinking available to them for reflection, monitoring, and revision (Scardamalia et al., 1989).

Additionally, as indicated by Cases 1 and 2 (IQ) and Case 6 (PQ), the monitoring and evaluation prompts helped students think about alternative solutions and their viability, an aspect often overlooked by novice problem solvers (Feltovich et al., 1996). As reported by Joe (IQ), the prompts helped him to think about side effects that he would not have considered otherwise. Matt in Case 6 (PQ) mentioned in the interview that his group always went back to the main problem to make sure they were on the right track by following the questions; thus they were able not only to discuss the risks, pros, and cons of their proposed system, but also to make justifications for the viability of their solutions after comparing the alternatives. On the other hand, although Paul (IC) and Joanne in Case 7 (PC) mentioned in their interviews that they had thought about or discussed the feasibility of possible solutions, they failed to assess constraints and consider alternative solutions in their reports. Thus, the question prompts might have served a metacognitive function, helping students recognize a need to know and to evaluate the limitations of their solutions. Further supporting this conclusion, Case 5 (PQ) reportedly ignored the monitoring and evaluation prompts, which might explain why they failed to justify and evaluate their final solutions in their report.

Effects of Peer Interactions

The effects of peer interactions mainly involved (a) building upon each other's ideas to develop solutions, (b) providing multiple perspectives, (c) asking questions and providing feedback, and (d) benefiting from distributed cognition.

Interview data and field notes indicated that cases in the peer conditions typically started the problem-solving process by brainstorming ideas, which were presented in the form of questions or suggestions, such as How about . . . ? What do you think . . . ? That's right, we can also . . . . Then an idea was developed. The solution reports showed that the peer conditions tended to take into account a wider range of factors and information in generating or selecting solutions. Perry in Case 6 (PQ) mentioned in the interview that a topic "went back and forth for a while" before solutions were developed and selected. Cathy (IQ) said that she generated possibilities and then quickly narrowed them down to one solution, but Cases 5 and 6 (PQ) stated that they spent some time "figuring out" the best solution. The process of working together with peers to develop the best solution may have helped the students to construct multiple problem spaces, which is required for solving ill-structured problems (Jonassen, 1997).

All the cases in the peer conditions mentioned in their interviews that they were exposed to different inputs and perspectives from their peers.
For example, the peers in Case 6 (PQ) were observed in field observations to spend considerable time brainstorming different ideas, weighing the pros and cons of various solutions, and providing feedback on suggestions. Reflecting on this, Matt (Case 6) said, "The group work made you see something you couldn't see on your own." As shown by the interview data, the other peer groups, such as Cases 7 and 8 in the PC condition, shared the same experience. We surmise that the peer interaction process encouraged learners to identify alternative views or perspectives on the problem (Jonassen, 1997), which in turn helped them to select the most relevant and useful solutions (Sinnott, 1989).

Evidence from observations and interviews showed that working collaboratively gave students a chance to ask questions, offer suggestions, elaborate thinking, and provide feedback, although some students did not use this opportunity to offer critical comments. Mark (Case 5, PQ) said that his group members asked
questions and received feedback, which helped them to develop solutions. Students in Case 7 (PC) were seen discussing the needs, feasibility, and pros and cons of the proposed information technology system through inquiry cycles of statements, questions, and elaboration or feedback. Such a cycle, called a "reflective toss" by van Zee and Minstrell (1997), helps students make their meanings clear and monitor their thinking processes. At the same time, videotaped observations of Case 7 (PC) and Case 8 (PC) indicated that there tended to be more agreement than disagreement among the members, and few constructive suggestions were made to each other. In Case 8 (PC), one student, Andy, reportedly had prior classroom experience in solving a similar type of problem, and hence tended to dominate the group, and his peers tended to accept his suggestions. Field observations and videotapes showed that off-task joking and chatting took place in Case 8. These data suggested that working together did not guarantee a positive peer interaction process that challenged every member to ask questions, elaborate thoughts, construct arguments, or provide suggestions.

Another advantage of working with peers was benefiting from each other's expertise. To illustrate, Bryan (Case 7, PC) was observed explaining a technical term to his peers, and Devin (Case 8, PC) said that he had learned a problem-solving strategy from Andy, his more experienced peer group member. The students in Case 6 (PQ) shared their understanding of the server and the user interface when the group was trying to build the database prototype, and Perry (Case 6, PQ) pointed out that the group work had contributed to high-quality solutions because four heads provided more input than one. During the interview, Case 5 (PQ) noted that an important benefit to them was the variety of ideas and the different areas of expertise that the peer interaction process afforded. From field observations, we found that the groups generally divided up their work according to each student's expertise, such as drafting documents, coming up with ideas, and developing a database prototype, especially during the latter part of the lab hours. These findings showed that cognition could be distributed and amplified across individuals with common goals and interests (Pea, 1993; Perkins, 1993; Salomon, 1993).

DISCUSSION

Both the quantitative and qualitative findings on the effects of the question prompts support the hypothesis that question prompts can facilitate not only well-structured problem solving, as shown by the studies of Schoenfeld (1985) and King (1991), but also ill-structured problem solving. The findings confirm the results of previous studies by King (e.g., 1989, 1991, 1992) and King and Rosenshine (1993) showing that structured guidance through questioning can enhance knowledge representation. They are also consistent with the research by Osman and Hannafin (1994) and Wong (1985) indicating that questions can serve as cues to direct student attention to important information that the students might otherwise have neglected. The effect of the justification prompts supports Lin and Lehman's (1999) finding that such prompts directed student attention to understanding when, why, and how. The monitoring and evaluation prompts helped students think about alternative solutions and their viability, an aspect often overlooked by novice problem solvers (Feltovich et al., 1996).
However, one concern with regard to our conclusions about the role of question prompts is that the prompts may have given students in the question-prompt conditions a strong advantage on the assessment, because the prompts focused on the same problem-solving processes as the rubrics. This limitation of our study should be taken into consideration when developing question prompts and scoring rubrics for future work. Yet we believe that the alignment of the question prompts and rubrics might serve as a guide for how prompts should map onto assessment, if prompting is to help students achieve expected learning outcomes.

The quantitative results for the students' problem-solving performance conflict with previous studies that showed positive effects of peer interactions in developing learner cognition and metacognition (Palincsar et al., 1987; Webb, 1982, 1989) and in improving performance and achievement (Johnson et al., 1985, 1986, 1990). However, the post hoc test indicating significantly better performance of the peer
conditions over the individual conditions in problem representation suggests that peer interactions can be an effective scaffolding strategy under certain conditions. There might be several explanations for the results on peer interactions found in this study, including time constraints and the short period of treatment. From field observations, we found that most of the groups spent the first part of the lab time exploring the problem space and brainstorming solutions; however, their remaining time was spent distributing specific tasks among individuals, with little interaction and feedback. The more frequent peer interactions observed at the beginning of the lab session might explain why the peer interaction conditions performed significantly better than the individual conditions in problem representation. Another explanation might be that the group report measure was not sensitive enough to pick up the individual learning effects produced by the group processing. If that is the case, other measures are needed to effectively evaluate the achievement of individuals as a result of the peer interaction process.

In addition, certain conditions might be required for the peer interaction strategy to work fully to facilitate problem solving. Group members might not know how to ask productive questions or how to elaborate thoughts, especially when their domain knowledge is limited (Land, 2000). As indicated by Webb and Palincsar (1996), in order for students to benefit from collaboration, they must request and provide explanations, compare ideas, and systematically use and evaluate evidence. Webb (1989) found that the students who learned most from peer interactions were those who asked for and provided explanations to others. King (1989) found that small groups that asked task-related questions and elaborated solutions were more successful at problem solving than groups that did not exhibit these behaviors. It was observed in our study that different group dynamics and compositions had different effects on the peer interaction process. For example, in Case 8, we saw that one student dominated the peer interactions by doing all the explaining while the other two students were relatively compliant. Although collaboration was limited, these two students reported that they learned a lot from the dominant group member, a finding that again suggests a need for measuring individual achievement during the peer interaction process in future studies.

Consistent with previous research, the qualitative data showed that students could benefit from peer interactions in several ways, such as building on each other's ideas, eliciting responses or explanations (Webb, 1989), sharing multiple perspectives (Lin et al., 1999), and taking advantage of each other's knowledge and competence (Pea, 1993). Thus, a remaining empirical question is how to maximize the positive effects of peer interactions. Previous research highlights the prominent role of the instructor in modeling effective comprehension monitoring (Greene & Land, 2000; Palincsar & Brown, 1984; Palincsar et al., 1987) and group-processing strategies (Johnson et al., 1985, 1986, 1990). With Johnson et al.'s (1990) group-processing strategy, for example, teachers and students review a group session, describe which member actions were helpful and unhelpful, and decide what actions to continue or change.
Our study indicated that simply placing students into unguided groups was not sufficient, and that providing students with question prompts might also be insufficient for effectively guiding interactions. As shown by Case 5, even though the PQ condition was provided with prompts, this did not ensure that every group member used them productively. Therefore, additional strategies, such as instructor modeling and monitoring, might also be needed to scaffold the processes of asking questions, elaborating, explaining, constructing arguments, providing constructive feedback, and monitoring.

Although the quantitative results failed to support our hypothesis that the combined use of question prompts and peer interactions would be most effective in facilitating ill-structured problem-solving processes, the qualitative data did point to some guiding and modeling effects of the question prompts and to the potential benefits of peer interaction. The fact that no interactive effect was found might be due to the small sample size. As noted in the results section, there was a trend for the PQ condition to have higher means in problem representation
and generating solutions than the other conditions. Increasing the sample size might increase the statistical power and could yield a significant interaction. Another speculation is that question prompts might have effects in guiding individuals within a group through the problem-solving process; however, it is unclear whether the question prompts helped the students to challenge each other with their own questions, elicit more explanations, and enhance argumentation and feedback. The question prompts might be useful for guiding both individuals and peers to focus on the questions provided, but they might offer limited support in helping students generate questions of their own or elaborate and clarify each other's understanding. This suggests that additional strategies, such as King's (1991, 1992, 1994) question-generation and elaboration prompts and Palincsar and Brown's (1984) modeling of clarifying, predicting, and monitoring, are needed for peers to use question prompts effectively and to maximize interactions through interpretation, elaboration, explanation, negotiation, and argumentation.

A modified replication of the study is necessary, with an increased number of treatments, a larger sample size, random assignment, and the use of pretest and posttest measures to ensure the equivalence of the different conditions. Individual achievement should be measured in addition to group performance. Future studies should also investigate individual accountability, in addition to group goals, in the peer interaction process (see Slavin, 1989). In this study, we focused more on group goals than on individual accountability, where the contribution of each member is identifiable. Other research efforts could examine the transfer effect of question prompts on student-generated questioning and its effect on ill-structured problem solving. Such a design would help overcome the drawback of question prompts providing advantages for the question-prompt treatment conditions.

Xun Ge [xge@ou.edu] is Assistant Professor with the Instructional Psychology and Technology Program at The University of Oklahoma. Susan M. Land [sland@psu.edu] is Assistant Professor with the Instructional Systems Program at The Pennsylvania State University.

REFERENCES

Bransford, J.D., Brown, A.L., & Cocking, R.R. (Eds.). (2000). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.
Bransford, J.D., & Stein, B.S. (1993). The IDEAL problem solver: A guide for improving thinking, learning, and creativity (2nd ed.). New York: W.H. Freeman and Company.
Brown, A.L. (1987). Metacognition, executive control, self-regulation, and other more mysterious mechanisms. In F.E. Weinert & R.H. Kluwe (Eds.), Metacognition, motivation, and understanding (pp. 65–116). Hillsdale, NJ: Lawrence Erlbaum Associates.
Brown, A.L., & Palincsar, A.S. (1989). Guided, cooperative learning and individual knowledge acquisition. In L.B. Resnick (Ed.), Knowing, learning and instruction: Essays in honor of Robert Glaser (pp. 393–451). Hillsdale, NJ: Lawrence Erlbaum Associates.
Chi, M., Bassok, M., Lewis, M., Reimann, P., & Glaser, R. (1989). Self-explanations: How students study and use examples in learning to solve problems. Cognitive Science, 13, 145–182.
Chi, M.T.H., Feltovich, P., & Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121–152.
Chi, M.T.H., & Glaser, R. (1985). Problem solving ability. In R.J.
Sternberg (Ed.), Human abilities: An information processing approach (pp. 227–250). New York: W.H. Freeman and Company.
Davis, E.A., & Linn, M. (2000). Scaffolding students' knowledge integration: Prompts for reflection in KIE. International Journal of Science Education, 22(8), 819–837.
Ericsson, K.A., & Simon, H.A. (1996). Protocol analysis: Verbal reports as data (Rev. ed.). Cambridge, MA: MIT Press.
Feltovich, P.J., Spiro, R.J., Coulson, R.L., & Feltovich, J. (1996). Collaboration within and among minds: Mastering complexity, individually and in groups. In T. Koschmann (Ed.), CSCL: Theory and practice of an emerging paradigm (pp. 25–44). Mahwah, NJ: Lawrence Erlbaum Associates.
Gick, M.L. (1986). Problem solving strategies. Educational Psychologist, 21(1&2), 99–120.
Gick, M.L., & Holyoak, K.J. (1980). Analogical problem solving. Cognitive Psychology, 12, 306–355.
Greene, B.A., & Land, S.M. (2000). A qualitative analysis of scaffolding use in a resource-based learning environment involving the World Wide Web. Journal of Educational Computing Research, 23(2), 151–180.
Greene, J.C., Caracelli, V.J., & Graham, W.F. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11, 255–274.
Hannafin, M., Land, S., & Oliver, K. (1999). Open learning environments: Foundations, methods, and models. In C.M. Reigeluth (Ed.), Instructional-design theories and models: Vol. 2. A new paradigm of
instructional theory (pp. 115–140). Mahwah, NJ: Lawrence Erlbaum Associates.
Johnson, R.T., Johnson, D.W., & Stanne, M.B. (1985). Effects of cooperative, competitive, and individualistic goal structures on computer-assisted instruction. Journal of Educational Psychology, 77(6), 668–677.
Johnson, R.T., Johnson, D.W., & Stanne, M.B. (1986). Comparison of computer-assisted cooperative, competitive, and individualistic learning. American Educational Research Journal, 23(3), 382–392.
Johnson, D.W., Johnson, R.T., Stanne, M.B., & Garibaldi, A. (1990). Impact of group processing on achievement in cooperative groups. The Journal of Social Psychology, 130(4), 507–516.
Jonassen, D.H. (1997). Instructional design models for well-structured and ill-structured problem-solving learning outcomes. Educational Technology Research and Development, 45(1), 65–94.
Jonassen, D.H., Beissner, K., & Yacci, M. (1993). Structural knowledge. Hillsdale, NJ: Lawrence Erlbaum Associates.
King, A. (1989). Verbal interaction and problem solving within computer-assisted cooperative learning groups. Journal of Educational Computing Research, 5(1), 1–15.
King, A. (1991). Effects of training in strategic questioning on children's problem-solving performance. Journal of Educational Psychology, 83(3), 307–317.
King, A. (1992). Facilitating elaborative learning through guided student-generated questioning. Educational Psychologist, 27(1), 111–126.
King, A. (1994). Guiding knowledge construction in the classroom: Effects of teaching children how to question and how to explain. American Educational Research Journal, 31(2), 338–368.
King, A., & Rosenshine, B. (1993). Effects of guided cooperative questioning on children's knowledge construction. Journal of Experimental Education, 61(2), 127–148.
Kitchner, K.S. (1983). Cognition, metacognition, and epistemic cognition: A three-level model of cognitive processing. Human Development, 26, 222–232.
Kitchner, K.S., & King, P.M. (1981). Reflective judgment: Concepts of justification and their relationship to age and education. Journal of Applied Developmental Psychology, 2, 89–116.
Land, S.M. (2000). Cognitive requirements for learning with open-ended learning environments. Educational Technology Research and Development, 48(3), 61–78.
Lin, X., Hmelo, C., Kinzer, C.K., & Secules, T.J. (1999). Designing technology to support reflection. Educational Technology Research and Development, 47(3), 43–62.
Lin, X., & Lehman, J.D. (1999). Supporting learning of variable control in a computer-based biology environment: Effects of prompting college students to reflect on their own thinking. Journal of Research in Science Teaching, 36(7), 837–858.
Lou, Y., Abrami, P.C., & d'Apollonia, S. (2001). Small group and individual learning with technology: A meta-analysis. Review of Educational Research, 71(3), 449–521.
Miles, M.B., & Huberman, A.M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Thousand Oaks, CA: Sage Publications.
Osman, M.E., & Hannafin, M.J. (1994). Effects of advance questioning and prior knowledge on science learning. Journal of Educational Research, 88(1), 5–13.
Palincsar, A.S., & Brown, A.L. (1984). Reciprocal teaching of comprehension-fostering and comprehension-monitoring activities. Cognition and Instruction, 2, 117–175.
Palincsar, A.S., Brown, A.L., & Martin, S.M. (1987). Peer interaction in reading comprehension instruction. Educational Psychologist, 22(3–4), 231–253.
Pea, R. (1993).
Practices of distributed intelligence and designs for education. In G. Salomon (Ed.), Distributed cognitions: Psychological and educational considerations (pp. 47–87). Cambridge, UK: Cambridge University Press.
Perkins, D.N. (1993). Person-plus: A distributed view of thinking and learning. In G. Salomon (Ed.), Distributed cognitions: Psychological and educational considerations (pp. 88–110). Cambridge, UK: Cambridge University Press.
Pressley, M., & McCormick, C.B. (1987). Advanced educational psychology for educators, researchers, and policymakers. New York: HarperCollins.
Roschelle, J. (1992). Learning by collaborating: Convergent conceptual change. Journal of the Learning Sciences, 2, 235–276.
Rosenshine, B., Meister, C., & Chapman, S. (1996). Teaching students to generate questions: A review of the intervention studies. Review of Educational Research, 66(2), 181–221.
Salomon, G. (1993). No distribution without individuals' cognition: A dynamic interactional view. In G. Salomon (Ed.), Distributed cognitions: Psychological and educational considerations (pp. 111–138). Cambridge, UK: Cambridge University Press.
Scardamalia, M., Bereiter, C., McLean, R.S., Swallow, J., & Woodruff, E. (1989). Computer-supported intentional learning environments. Journal of Educational Computing Research, 5, 51–68.
Scardamalia, M., Bereiter, C., & Steinbach, R. (1984). Teachability of reflective processes in written composition. Cognitive Science, 8, 173–190.
Schoenfeld, A.H. (1985). Mathematical problem solving. San Diego, CA: Academic Press.
Sinnott, J.D. (1989). A model for solution of ill-structured problems: Implications for everyday and abstract problem solving. In J.D. Sinnott (Ed.), Everyday problem solving: Theory and application (pp. 72–99). New York: Praeger.
Slavin, R.E. (1989). Cooperative learning and student achievement. In R.E. Slavin (Ed.), School and classroom organization (pp. 129–156). Hillsdale, NJ: Lawrence Erlbaum Associates.
Stake, R.E. (2000). Case studies. In N.K. Denzin & Y.S.
Strauss, A., & Corbin, J. (1998). Basics of qualitative research: Techniques and procedures for developing grounded theory. Thousand Oaks, CA: Sage Publications.
van Zee, E., & Minstrell, J. (1997). Using questioning to guide student thinking. The Journal of the Learning Sciences, 6(2), 227–269.
Voss, J.F. (1988). Problem solving and reasoning in ill-structured domains. In C. Antaki (Ed.), Analyzing everyday explanation: A casebook of methods (pp. 74–93). London: Sage Publications.
Voss, J.F., & Post, T.A. (1988). On the solving of ill-structured problems. In M.H. Chi, R. Glaser, & M.J. Farr (Eds.), The nature of expertise (pp. 261–285). Hillsdale, NJ: Lawrence Erlbaum Associates.
Voss, J.F., Wolfe, C.R., Lawrence, J.A., & Engle, R.A. (1991). From representation to decision: An analysis of problem solving in international relations. In R.J. Sternberg & P.A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 119–158). Hillsdale, NJ: Lawrence Erlbaum Associates.
Vygotsky, L.S. (1978). Mind in society. Cambridge, MA: Harvard University Press.
Webb, N.M. (1982). Group composition, group interaction and achievement in cooperative small groups. Journal of Educational Psychology, 74, 475–484.
Webb, N.M. (1989). Peer interaction and learning in small groups. International Journal of Educational Research, 13, 21–39.
Webb, N.M., & Palincsar, A.S. (1996). Group processes in the classroom. In D.C. Berliner & R.C. Calfee (Eds.), Handbook of educational psychology (pp. 841–873). New York: Simon & Schuster Macmillan.
Wineburg, S.S. (1998). Reading Abraham Lincoln: An expert-expert study in the interpretation of historical texts. Cognitive Science, 22, 319–346.
Wong, B.Y.L. (1985). Self-questioning instructional research: A review. Review of Educational Research, 55, 227–268.
Yin, R.K. (1989). Case study research: Design and methods (2nd ed.). Thousand Oaks, CA: Sage Publications.

Appendix A
The Question Prompt Treatment Material

Something to Think About . . .

As you work through the problem, please read and think about the following questions.

How do I define the problem?
1. What are the parts of the problem?
2. What are the technical components?
3. What information do you need for this system? How will the system be used, by whom, and for what?
• Who would be the users?
• What information do you expect to be needed by the users?
• What level of prior knowledge do you expect the users to have?
• How would a user ideally interact with the proposed system?

What solutions do I need to generate?
4. What should the system do?
5. How should the different technical components of the proposed system interrelate?
6. What are the risks?

What are my reasons, or what is my argument for my proposed solution?
7. How would I justify this specific system design? For example, if I develop a web-based solution, can I explain why I took that approach?
8. Do I have evidence to support my solution (that is, the specific IT system I have proposed)? What is my chain of reasoning to support my solution?

Am I on the right track?
9. Have I discussed both the technical components and the issues with use, for example, usability and effectiveness?
10. Are there alternative solutions?
• What are they?
• How do they compare with my proposed system?
• What argument can I make, or what evidence do I have, to convince the manager that my solution is the most viable?
Appendix B
Data Display Example: The Effects of Peer Interactions on Problem-Solving Processes

Brainstorm
Reactions or consequences: Come up with different ideas to solve the problem (e.g., Cases 5, 6, 7).
Data sources: Interviews; observation.
Processes or thinking influenced: Problem representation (cognitive thinking).

Ask questions
Reactions or consequences: Explain (e.g., Case 8: explain what a PDA is; Case 5: what a server is).
Data sources: Observation; interview.
Processes or thinking influenced: Developing solutions (cognitive thinking); making justifications (metacognitive skills).
Reactions or consequences: Examine the thinking process, solutions, etc. (e.g., Case 7: cause one to examine the feasibility of a solution; Case 8: evaluate the solutions and modify them accordingly).
Data sources: Interview; observation.
Processes or thinking influenced: Monitoring and evaluating the solution process (metacognitive skills).

Provide feedback
Reactions or consequences: Monitor the thinking process (e.g., Case 6: examine pros and cons and decide what systems to use, external vs. internal; Case 5: test the system); reflect on one’s thinking (e.g., Case 5: see things that could not have been thought about otherwise).
Data sources: Interviews.
Processes or thinking influenced: Monitoring and evaluating the solution process (metacognitive skills).

Elaborate ideas
Reactions or consequences: Build on each other’s ideas (e.g., Cases 7, 8, 6).
Data sources: Observations; interviews.
Processes or thinking influenced: Developing solutions (cognitive thinking); problem representation (cognitive thinking).

Make suggestions
Reactions or consequences: Build on each other’s ideas for developing solutions (Cases 7, 8, 6); see things from other perspectives, that is, think about things that could not have been thought about alone (Case 5).
Data sources: Observations; interviews.
Processes or thinking influenced: Developing solutions (cognitive thinking); monitoring and evaluating the solution process (metacognitive skills).

Share ideas
Reactions or consequences: Get multiple perspectives (Cases 5, 6, 8); share expertise (Case 6: my ideas combined with other people’s ideas; Cases 7, 8: take expertise from each other).
Data sources: Interviews.
Processes or thinking influenced: Monitoring and evaluating the solution process (metacognitive skills); developing solutions (cognitive thinking).